
The Measurement Dilemma — Navigating Privacy-Driven Disruption

Changes in privacy legislation, the deprecation of the third-party cookie, and new rules on Google and Apple platforms have set the stage for the impending data disruption in the advertising industry, as outlined in IAB’s State of Data 2022 report and OptiMine’s overview on Google Topics. Both presentations and the subsequent panel discussion in this Insights Studio session emphasized the unavoidable impact the loss of individual tracking will have on measurement and attribution and urged marketers to act quickly to prepare for the effects on revenues.

Tackling the Challenges of Local OTT Attribution

Stu Schwartzapfel of iSpot.tv and Traci Will of Gamut talked about dealing with the challenges of local OTT attribution. While national brands have been a fixture on streaming platforms for a while now, smaller and local brands are just beginning to dip their toes into this space. Industry challenges include a lack of standardized measurement, a national rather than local focus, and careless measurement that can yield confusing results.

Dynamic Addressable TV Advertising over the Customer Lifecycle

Rex Du (University of Texas at Austin) explained that targeting addressable TV ads over the customer lifecycle lets a brand optimize for long-term customer profitability, an approach that delivers stronger outcomes than targeting to maximize same-day incremental conversions.
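The contrast between the two targeting rules can be sketched in a few lines. This is a toy illustration only, not the presented methodology; the households, uplift estimates, and field names are all invented.

```python
from dataclasses import dataclass

@dataclass
class Household:
    """Hypothetical per-household estimates from an uplift model."""
    hh_id: str
    inc_conversion_today: float  # estimated same-day incremental conversion probability
    inc_clv: float               # estimated incremental long-term customer value ($)

def target(households, score, budget):
    """Pick the top-`budget` households under the given scoring rule."""
    return sorted(households, key=score, reverse=True)[:budget]

households = [
    Household("a", inc_conversion_today=0.08, inc_clv=5.0),   # quick sale, low loyalty
    Household("b", inc_conversion_today=0.02, inc_clv=40.0),  # slow start, high lifetime value
    Household("c", inc_conversion_today=0.05, inc_clv=25.0),
]

same_day = target(households, lambda h: h.inc_conversion_today, budget=2)
lifecycle = target(households, lambda h: h.inc_clv, budget=2)

print([h.hh_id for h in same_day])   # ['a', 'c']
print([h.hh_id for h in lifecycle])  # ['b', 'c']
```

The two rules select different audiences: the same-day rule spends impressions on households likely to convert immediately, while the lifecycle rule favors households whose value accrues over time.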

Leveraging Look Alike Models when A/B Testing isn’t an Option

It isn’t always possible to perform A/B tests when it comes to evaluating the impact of paid media campaigns. Caroline Iurillo and Megan Lau of Microsoft outlined the company’s development of a strategy which matches campaign exposure data with a customer database and then creates “look-alikes” for non-exposed customers to make audiences as comparable as possible. Lifts in perceptions, behaviors and revenue can then be compared (in aggregate) between exposed customers and their non-exposed look-alikes to determine the effectiveness of a campaign.
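A rough sketch of the matching step follows. This is not Microsoft’s actual pipeline; the customer records, profile features, and the simple nearest-neighbor distance rule are all assumptions made for illustration.

```python
import math

def nearest_lookalike(exposed, candidates, features):
    """Match each exposed customer to the most similar non-exposed
    customer by Euclidean distance over profile features (with replacement)."""
    def dist(a, b):
        return math.dist([a[f] for f in features], [b[f] for f in features])
    return {e["id"]: min(candidates, key=lambda c: dist(e, c)) for e in exposed}

# Hypothetical customer records: profile features plus an outcome (revenue).
exposed = [
    {"id": "e1", "tenure": 24, "spend": 100, "revenue": 130},
    {"id": "e2", "tenure": 6,  "spend": 40,  "revenue": 55},
]
non_exposed = [
    {"id": "n1", "tenure": 25, "spend": 95,  "revenue": 110},
    {"id": "n2", "tenure": 5,  "spend": 45,  "revenue": 50},
    {"id": "n3", "tenure": 60, "spend": 300, "revenue": 320},
]

matches = nearest_lookalike(exposed, non_exposed, features=["tenure", "spend"])

# Aggregate lift: mean outcome of exposed minus mean outcome of their look-alikes.
lift = sum(e["revenue"] for e in exposed) / len(exposed) \
     - sum(m["revenue"] for m in matches.values()) / len(matches)
print(f"aggregate revenue lift per customer: {lift:.1f}")  # 12.5
```

In practice the matching would use many more features (and typically propensity scores rather than raw distance), but the aggregate comparison at the end is the same idea.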

Navigating Through Uncertainty with Next-Gen Marketing Mix

Greg Dolan (Keen Decision Systems) and Mark Bennet (Johnsonville Milwaukee) examined how to navigate in uncertain and volatile times in the current marketplace using next-generation marketing mix solutions. In the opening, Greg explored the progression of marketing from the late 90s, through what he dubbed “The Roaring 20s.” He noted that we went from a minimally complex, slower-paced “top-down” approach in the 1990s to a very fast-paced, complex environment in the 2020s, marked by a shift to Retail Media and a unified approach built on next-generation predictive analytics. Greg discussed the intricacies of their approach, which leverages machine learning to combine historical data with predictive and prescriptive plans to address drastic changes in the current environment. He provided a case study that demonstrated the successful application of the next-generation marketing mix. In addition, Mark gave a client perspective on how they are handling market uncertainty.

Harnessing the Full Potential of Marketing Mix Models: How Attention, Creative and Audience Personalization can Drive ROI

Sameer Kothari (PepsiCo) and Todd Kirk (Middlegame Marketing Sciences) examined the application of a transformed rendition of marketing mix modeling created through the development of a proprietary system called the “ROI Engine.” Sameer indicated the desire to harness the “true potential of marketing mix models even beyond measuring past campaigns and using it for strategic planning looking forward.” Sameer discussed this system as having a more “predictive ROI outcome-based approach” by “leveraging an ecosystem of leading indicators for before and during a campaign flight.”

Rebuilding MMM to Handle Fragmented Data: The Challenge of Retailer Media

Liz Riley (OLLY) and Mark Garratt (In4mation Insights) explored rebuilding and reimagining marketing mix modeling (MMM) to better handle fragmented data in the era of retail media networks. Mark lauded MMM as an effective technique that has contributed to financial success for many businesses. In light of data becoming increasingly fragmented, he suggested that “some reinvention of the fundamental model framework is going to be required in order to move this old venerable method into the future.” Mark and Liz examined the Bayesian approach to MMM in handling fragmented data. Mark noted that there will not be a situation “where all the data is the same granularity in one place at one time.” The Bayesian approach can “fill in the blanks” of missing or fragmented data using reasonable estimates, creating a more accurate picture, which traditional MMM falls short of in the retail media environment.
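The “fill in the blanks” idea can be illustrated with the simplest possible Bayesian building block: a conjugate normal-normal update in which a prior (say, a national-level ROI estimate) is combined with however much local data a given retailer reports. This is a toy sketch, not In4mation Insights’ model; the retailer names, ROI readings, and variances are invented.

```python
def posterior_mean(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal-normal update: blend a prior estimate with
    however many local observations are available."""
    if not obs:                # no local data: the prior fills the blank
        return prior_mean
    n = len(obs)
    xbar = sum(obs) / n
    # Precision-weighted average of the local mean and the prior mean.
    w = (n / obs_var) / (n / obs_var + 1 / prior_var)
    return w * xbar + (1 - w) * prior_mean

# Hypothetical weekly ROI readings per retail media network; granularity differs.
prior_mean, prior_var, obs_var = 2.0, 1.0, 1.0
retailers = {
    "retailer_A": [2.6, 2.4, 2.8, 2.2, 2.5, 2.7],  # rich data -> stays near its own mean
    "retailer_B": [3.5],                           # one noisy point -> shrunk toward the prior
    "retailer_C": [],                              # no data -> prior fills the blank
}
for name, obs in retailers.items():
    print(name, round(posterior_mean(prior_mean, prior_var, obs, obs_var), 2))
```

Retailers with plenty of data keep estimates close to what they observed, sparse retailers are shrunk toward the shared prior, and retailers with no data at all inherit it outright; a full Bayesian MMM applies the same pooling logic across every channel and granularity simultaneously.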

Panel Discussion

Michael Cohen (Plus Company) moderated this panel with the day’s speakers on topics ranging from overcoming non-representative samples, data validation, and change management to the impact of AI.

Modernizing Attribution for the Future

The marketplace is rapidly changing and the industry faces challenges due to digital disruption, marketplace fragmentation and consumer privacy issues. Nielsen’s Katie Koval explained that the audience identifiers of today, such as cookies, will not be useful in the future. As a result, according to Nielsen’s Annual Marketing Report, in aggregate, 46% of marketers are not confident in the measurement of ROI across digital channels.