Who’s Auditing the Metrics? MediaSense Just Did—and AU Passed
In a market crowded with metrics, it’s easy to make claims. Proving them is harder. That’s why we asked MediaSense, a global media advisory firm, to conduct an independent, third-party audit of our omnichannel AU metric and its methodology.

We believe media quality metrics should be trustworthy, verifiable, and directly tied to outcomes. We don’t publicly disclose the algorithm weights used to calculate AU scores, a deliberate choice that limits exposure to the kinds of gaming that have undermined other metrics, but that discretion comes with a responsibility to be accountable. To back that up, we gave MediaSense access to our methodology, assumptions, and internal processes so they could independently assess whether AU does what we say it does: predict meaningful brand outcomes.
What MediaSense Set Out to Evaluate
MediaSense focused on three core hypotheses:
- Is AU’s scoring methodology appropriate for its objective? Are there alternatives that should be considered?
- Is the AU Curve Materiality Methodology robust, given its objective of helping advertisers set optimal AU thresholds based on visualized outcomes? Can MediaSense independently rebuild Adelaide’s AU curves using sampled data and confirm that increases in AU are associated with improved campaign outcomes?
- Can the AU Curve Materiality Methodology be used consistently and reliably to draw outcome-based conclusions for Adelaide’s clients?
MediaSense’s Approach—No Black Boxes In Sight
Between February and June 2025, MediaSense conducted a rigorous review of Adelaide’s AU methodology and AU Curve Materiality Analysis. The process included technical walkthroughs, documentation review, and an independent reconstruction of AU curves across four KPIs: Aided Awareness, Familiarity, Recall, and Intent.
MediaSense scrutinized every assumption, caveat, and input behind AU. Then came the real test: rebuilding the AU curves from scratch using their own data science approach and our sampled data. The goal wasn’t just to replicate outputs but to validate that AU’s predictive power holds up under fully independent analysis.
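Adelaide’s actual curve-fitting methodology is not public, but the shape of the audit’s test can be sketched at a high level: given placements scored by AU and a measured brand outcome for each, rebuild an empirical outcome-vs-AU curve and check that outcomes rise as AU rises. The binning approach, sample data, and function below are illustrative assumptions, not Adelaide’s or MediaSense’s implementation.

```python
# Illustrative sketch only: bins placements by AU score and averages a brand
# outcome (e.g. lift in Aided Awareness) per bin, yielding a simple empirical
# "AU curve". The data here is synthetic; real analysis uses sampled campaigns.

def au_curve(scores, outcomes, n_bins=5):
    """Mean outcome per AU-score bin, from lowest to highest AU."""
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / n_bins
    bins = [[] for _ in range(n_bins)]
    for s, o in zip(scores, outcomes):
        i = min(int((s - lo) / width), n_bins - 1)  # clamp the max score
        bins[i].append(o)
    return [sum(b) / len(b) for b in bins if b]

# Hypothetical sample: outcome rises with AU, plus a small deterministic wiggle.
scores = [i / 100 for i in range(10, 100)]
outcomes = [0.02 + 0.3 * s + 0.01 * ((i % 3) - 1) for i, s in enumerate(scores)]

curve = au_curve(scores, outcomes)
monotone = all(a <= b for a, b in zip(curve, curve[1:]))
print(curve, monotone)
```

If the reconstructed curve is monotonically increasing across KPIs, as MediaSense found, higher-AU inventory can be read as a reliable predictor of stronger brand outcomes.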
What They Found
- The AU methodology is outcome-driven and follows data science best practices. MediaSense confirmed that AU scores are strong proxies for a placement’s likelihood to drive brand outcomes.
- The AU Curve Methodology is transparent, reproducible, and independently verifiable. MediaSense successfully rebuilt the AU curves, demonstrating that AU is not a black box. Its statistical foundations are robust, traceable, and repeatable, offering advertisers a trustworthy framework for assessing media quality.
- Higher AU scores reliably correlate with improved performance on advertiser KPIs. This predictive value holds across multiple KPIs and channels, supporting AU’s use as an input for media optimization.
Why This Matters
Brands and agencies are right to be skeptical of new metrics—especially when so many have been burned by outdated or imprecise measurement tools. Attention metrics like AU offer a real opportunity to reshape how the industry measures, values, and transacts on media, but that potential relies on sound methodology, consistency, and independent verification.
MediaSense’s audit confirms what we’ve seen across hundreds of campaigns: AU works. It helps advertisers plan, optimize, and invest in media that delivers real impact, providing a precise measure of media quality that both buyers and sellers can trust.
Download the full MediaSense report for an in-depth look at their methodology and findings.
