
Concurrent Track Panel Discussions: ATTENTION MEASURES

These presenters were all true believers in the value of attention. Their key takeaways from the presentations in this track were:

  • Attention is “ready for prime time,” as Marc Guldimann (Adelaide) put it. It has risen to prominence on the industry’s agenda, and he expects it to spread into media mix modeling and programmatic. Attention, he believes, should free the industry from “invasive” attribution practices by giving advertisers confidence in the quality of the media they are buying.
  • Jon Waite (Havas) was encouraged to see attention move from theory to practice for optimizing campaigns. He believes that the focus on attention would encourage publishers to improve experiences on the web, which, in turn, would lead to better results for brands.
  • Mike Follett (Lumen) cautioned that there is still much to learn about attention in different contexts: flighting, frequency, differences between B-to-B and B-to-C, the role of creative, and long-term effects. What he found interesting in Joanne Leong’s presentation (to which he contributed) is the possibility of developing models that can predict attention for any campaign.
  • Publishers have come up with innovative formats to optimize for attention on television, according to Kelsey Hanlon (TVision).


There was some disagreement among the panelists about the prospects for an attention currency. Marc saw it as an “obvious next step.” Mike regarded attention as more of a buy-side “trading tool.” Jon said it would become a key planning metric for Havas.

FORECASTING 2022: How Can Scenario Planning Improve Agility in Adjusting to Change?

On July 12, 2022, forecasting and product experts shared frameworks and strategies for participants to consider as they plan amid disruptions in the industry. Presenters discussed techniques marketers can use to drive consumer action and advocacy, econometric models for search trends, insights on holistic analytics programs, reflections on gold-standard probability methods, and new forecasting techniques developed in the wake of the pandemic.

The Measurement Dilemma — Navigating Privacy-Driven Disruption

Changes in privacy legislation, the deprecation of the third-party cookie, and new rules on Google and Apple platforms have set the stage for the impending data disruption in the advertising industry, as outlined in IAB’s State of Data 2022 report and OptiMine’s overview on Google Topics. Both presentations and the subsequent panel discussion in this Insights Studio session emphasized the unavoidable impact the loss of individual tracking will have on measurement and attribution and urged marketers to act quickly to prepare for the effects on revenues.

Tackling the Challenges of Local OTT Attribution

Stu Schwartzapfel of iSpot.tv and Traci Will of Gamut discussed the challenges of local OTT attribution. While national brands have been a fixture on streaming platforms for some time, smaller and local brands are just beginning to dip their toes into this space. Industry challenges include a lack of standardized measurement, a national versus local focus, and careless measurement that can produce confusing results.

Dynamic Addressable TV Advertising over the Customer Lifecycle

Rex Du (University of Texas at Austin) explained that targeting addressable TV ads over the customer lifecycle allows a brand to optimize for long-term customer profitability. This methodology delivers stronger outcomes than targeting to maximize same-day incremental conversions.

Leveraging Look-Alike Models When A/B Testing Isn’t an Option

It isn’t always possible to perform A/B tests when evaluating the impact of paid media campaigns. Caroline Iurillo and Megan Lau of Microsoft outlined the company’s development of a strategy that matches campaign exposure data with a customer database and then creates “look-alikes” for non-exposed customers, making audiences as comparable as possible. Lifts in perceptions, behaviors, and revenue can then be compared (in aggregate) between exposed customers and their non-exposed look-alikes to determine the effectiveness of a campaign.
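The matching step described above can be sketched as a nearest-neighbor search in customer-feature space. This is an illustrative reconstruction, not Microsoft’s actual implementation; the function names, the use of Euclidean distance on standardized features, and the simple mean-difference lift are all assumptions.

```python
import numpy as np

def match_lookalikes(exposed_feats, unexposed_feats, k=1):
    """For each exposed customer, return the indices of the k most similar
    non-exposed customers (nearest neighbors in standardized feature space)."""
    pooled = np.vstack([exposed_feats, unexposed_feats])
    mu, sd = pooled.mean(axis=0), pooled.std(axis=0) + 1e-9
    e = (exposed_feats - mu) / sd
    u = (unexposed_feats - mu) / sd
    # Pairwise Euclidean distances: one row per exposed customer,
    # one column per non-exposed customer
    dists = np.linalg.norm(e[:, None, :] - u[None, :, :], axis=2)
    return np.argsort(dists, axis=1)[:, :k]

def aggregate_lift(exposed_outcome, unexposed_outcome, matches):
    """Mean outcome among exposed customers minus the mean outcome of
    their matched look-alikes, compared in aggregate."""
    control = unexposed_outcome[matches].mean(axis=1)
    return exposed_outcome.mean() - control.mean()
```

In practice the features would be pre-campaign behaviors and demographics drawn from the customer database, and the outcome could be any perception, behavior, or revenue measure.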

Navigating Through Uncertainty with Next-Gen Marketing Mix

Greg Dolan (Keen Decision Systems) and Mark Bennet (Johnsonville Milwaukee) examined how to navigate uncertain and volatile market conditions using next-generation marketing mix solutions. In the opening, Greg traced the progression of marketing from the late 90s through what he dubbed “The Roaring 20s”: from a minimally complex, slower-paced, “top-down” approach in the 1990s to the very fast-paced, complex environment of the 2020s, marked by the shift to Retail Media and a unified approach that applies next-generation predictive analytics. Greg discussed the intricacies of Keen’s approach of combining historical data with predictive/prescriptive plans, leveraging machine learning to address drastic changes in the current environment. He provided a case study demonstrating the successful application of the next-generation marketing mix. In addition, Mark gave a client’s perspective on how Johnsonville is handling market uncertainty.

Harnessing the Full Potential of Marketing Mix Models: How Attention, Creative and Audience Personalization can Drive ROI

Sameer Kothari (PepsiCo) and Todd Kirk (Middlegame Marketing Sciences) examined a transformed version of marketing mix modeling, delivered through a proprietary system called the “ROI Engine.” Sameer described the ambition as harnessing the “true potential of marketing mix models even beyond measuring past campaigns and using it for strategic planning looking forward.” He characterized the system as taking a more “predictive ROI outcome-based approach” by “leveraging an ecosystem of leading indicators for before and during a campaign flight.”

Rebuilding MMM to Handle Fragmented Data: The Challenge of Retailer Media

Liz Riley (OLLY) and Mark Garratt (In4mation Insights) explored rebuilding and reimagining marketing mix modeling (MMM) to better handle fragmented data in the era of retail media networks. Mark lauded MMM as an effective technique that has contributed to financial success for many businesses. With data becoming increasingly fragmented, however, he suggested that “some reinvention of the fundamental model framework is going to be required in order to move this old venerable method into the future.” Mark and Liz examined a Bayesian approach to MMM for handling fragmented data. Mark noted that there will never be a situation “where all the data is the same granularity in one place at one time.” The Bayesian approach can “fill in the blanks” of missing or fragmented data with reasonable estimates, creating a more accurate picture, something traditional MMM cannot do in the retail media environment.
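As a toy illustration of how a Bayesian model can “fill in the blanks,” here is a minimal conjugate Bayesian linear regression over channel coefficients: where a retailer’s data are missing or sparse, the posterior simply stays close to the prior rather than failing outright. This is a sketch of the general idea only; the function name is hypothetical, and In4mation Insights’ actual framework is certainly far richer (hierarchical, with adstock and saturation effects).

```python
import numpy as np

def bayesian_posterior(X, y, prior_mean, prior_cov, noise_var=1.0):
    """Conjugate Bayesian linear regression: posterior over channel
    coefficients given a spend matrix X and a sales vector y.
    With little or no data, the posterior stays near the prior,
    so sparse retailer feeds degrade the estimate gracefully."""
    prior_prec = np.linalg.inv(prior_cov)
    post_cov = np.linalg.inv(prior_prec + X.T @ X / noise_var)
    post_mean = post_cov @ (prior_prec @ prior_mean + X.T @ y / noise_var)
    return post_mean, post_cov
```

The prior here is what carries information across granularities: estimates learned from well-measured channels (or earlier models) can be encoded in `prior_mean` and `prior_cov` for channels where observations are thin.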