
Prior Attentive Ad Exposures Increase Ad Attention

Tristan Webster and Kenneth Wilbur showcased their most recent collaborative work examining attention and frequency in advertising: the impact of multiple exposures on people’s attention to TV ads. They applied CTV data that TVision collected natively in the field to address the long-examined question, “Is there an optimal frequency for TV ads?”, and a more granular one: “What is happening in the media environment while viewers see ads, and how does that affect their attention?”

Building Trust Through Transparency

In a 2019 Pew survey, 79% of Americans said they know big companies track their online behavior, while 59% said they don’t understand how that data is used and feel they have little control over it. In a 2020 survey, Pew found that most Americans want the right to permanently delete their health-related data.

Cookieless Audience Targeting and Attribution: A Pharma Case Study

While attribution has been around for two decades and was a great breakthrough for digital, the deprecation of third-party cookies is likely to have a significant impact on it. Options for multi-touch attribution (MTA) include walled gardens, which are focused on a single channel, and data clean rooms such as those operated by the identity companies. Clean rooms are data-intensive but much better than single-channel walled gardens.

Advertising’s Sequence of Effects on Consumer Mindset and Sales

The academic study at the heart of this presentation compared 13 hierarchy-of-effects (HoE) advertising models to determine which model matters most, which moderators are most prominent, and which factors and sequences are most important in driving sales. Understanding the sequence of effects is essential for advertisers and marketers as they build their campaigns.

MODERATED TRACK DISCUSSIONS: Cross-Platform: Measurement & Identity

During the discussion, Duane Varan (MediaScience) and Steve Bellman (MediaScience & Ehrenberg-Bass Institute) discussed the metrics they used, the implications of their research for media planning, and the results of their research in the U.S. Sean Pinkney (Comscore) talked about the lessons he had taken away from his analysis of missing data.

Concurrent Track Panel Discussions: NEW METHODS TO VALIDATE AUDIENCE ESTIMATES

This live panel featured the presenters from the New Methods to Validate Audience Estimates track, with moderator Megan Margraff of Oracle following up on key points involving alternate data currencies, data harmonization and normalization, fragmentation challenges and advanced targeting in TV and CPG.

Bridging the Gap Between Linear and Digital Measurement

Integrating linear TV into cross-platform measurement was a challenge undertaken in a partnership between Lucid and Samba TV, utilizing ACR (automatic content recognition) and STB (set-top box) data matched to survey responses. Stephanie Gall (Lucid) and Karen Biedermann (Samba TV) shared details on the inherent problems, potential solutions, and biggest learnings from this integration.

The Exploding Complexity of Programming Research, and How to Measure It, When Content is King

Programming researchers are not getting the data they need to make informed decisions, and Joan FitzGerald (Data ImpacX) used streaming’s complex ecosystem to explain the conundrum facing programmers. Despite the inundation of new forms of data, key insights into monetization and performance remain unsupported, leaving programmers without a comprehensive picture of their audience. Together with Michael McGuire (MSA), Joan outlined a methodology funnel that combines first-, second-, and third-party data to create equivalized metrics that, once leveraged, could meet critical programming research demands.