
Putting Cinema in the Frame

Mike Follett, CEO, Lumen Research

Manu Singh, SVP, Insights/Analytics & Sales Data Strategy, National CineMedia (NCM)



Lumen has built a comprehensive visual-attention dataset covering most media, yet cinema was missing. How much attention goes to cinema advertising, and how does it compare with other media? Clients have been asking Lumen to complete the picture—to “put cinema in the frame.” Methodology: 151 respondents across six screenings, with infrared head-tracking cameras used to record when people were and were not looking at the screen—the first-ever eye tracking data for cinema in the U.S. Data were analyzed second by second. The attention funnel: of 1,000 ads, 51% were technically viewable, only 9% were actually looked at, and those that were held the eye for just 1.6 seconds. Findings:
  • In contrast to TV and other media—digital display in particular—cinema ads are functionally unmissable: almost all cinema ads are viewed.
  • Based on the premise that more attention is usually better—the longer viewers engage, the more likely they are to remember an ad—the findings show that cinema ads hold attention for dramatically more time than any other media, as much as 36x that of some digital formats.
  • A greater proportion of cinema ads is watched all the way through—cinema ads are often longer than other formats, yet they still outperform other media as a proportion of viewable time.
  • Cinema is both audio and visual—even when you’re not looking at the screen, you may still be listening to it.
  • Cinema drives more attentive seconds per 1,000 impressions than any other media.
Attention and recall correlate roughly 1:1 in survey data: for every percentage-point increase in attention there is a corresponding lift in brand recall and brand choice. Implications for advertising: APM × CPM = ACPM; ACPM can be embedded within agency planning tools; and ACPM + BLS = the “value of attention.”
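The funnel arithmetic and the APM × CPM = ACPM relationship above can be sketched as follows. This is a minimal illustration; the function names, and the reading of 9% as the share of all ads served, are assumptions rather than Lumen's published definitions:

```python
def attentive_seconds_per_mille(view_rate, avg_dwell_s):
    """Attentive seconds generated per 1,000 impressions (APM).

    view_rate: share of served ads actually looked at (e.g. 0.09)
    avg_dwell_s: average viewing time, in seconds, for ads looked at
    """
    return 1000 * view_rate * avg_dwell_s

def attentive_cpm(apm, cpm):
    """ACPM as stated in the session: APM x CPM."""
    return apm * cpm

# Funnel from the talk: of 1,000 ads, 9% were looked at,
# for 1.6 seconds on average.
apm = attentive_seconds_per_mille(0.09, 1.6)
print(apm)  # ~144 attentive seconds per 1,000 impressions
```

On these numbers, cinema's near-universal viewing and longer dwell times translate directly into a much higher APM than the digital funnel shown.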

Key Takeaways

  • The first ever eye tracking data for cinema ads in the U.S.
  • Ads in cinema are functionally unmissable; they hold attention for dramatically more time than any other media; cinema ads are often longer than other formats but still outperform other media as a proportion of viewable time; and cinema drives more “attentive seconds per 1,000 impressions” than any other media.
  • Three main drivers of attention: 1) size of screen; 2) lack of distraction; 3) mood of audience.


Engaging the Next Generation: Challenges and Opportunities in Marketing to Gen Z

Aarti Bhaskaran, Global Head of Ad Research & Insights, Snap Inc.

Kara Louis, Group Research Manager, Snap Inc.



With 90% of Gen Z using their platform, Snap’s Aarti Bhaskaran and Kara Louis shared six top trends gathered over three years of studying this cohort from a combination of consumer insights and media measurement. As the most diverse generation in the U.S., Gen Z values authenticity and looks for personal online experiences in ways that inform how they engage with content. The data presented showed how Gen Z are visual communicators and mobile video natives. They trust real content from friends and family, pay attention early in ads, care about purpose messaging and want immersive shopping experiences, ideally personalized with AR. The Q&A after the presentation covered other aspects of the study including global differences, ad lengths, content creators and costs, and Gen Z-favored brands.

Brands looking to connect with Gen Z should:

  • Adapt to visual communication: 95% of Gen Z have used visual communication when messaging friends, and 54% of Gen Zs agree that digital avatars/Bitmojis help them express themselves.
  • Leverage mobile video: For Gen Z, mobile is a complement to TV, with 1 in 2 (54%) liking shorter shows or bite-size highlights of TV shows.
  • Ensure real, brand-safe content: 89% say it’s important to watch videos in a place that feels like a trusted and safe space.
  • Use immersive experiences: Interactive and personalized shopping experiences are a must—92% of Gen Z are interested in using AR for shopping, with over half of Gen Z saying they’d be more likely to pay attention to an ad that uses AR.
  • Capture their attention early: Despite having the lowest overall active attention across generations, Gen Z has the highest active attention in the first 8 seconds.
  • Feature purpose-driven messaging: 73% of Gen Z are likely to be loyal to a company that speaks to social issues, posts information or has ads about social change.


Beyond Measurement: How Coca-Cola Uses Attention Metrics to Increase Efficiency & ROAS

Marc Guldimann, Founder & CEO, Adelaide

Greg Pharo, Sr. Global Director, Holistic Communications & Marketing Effectiveness, The Coca-Cola Company

This presentation discussed a collaboration between Coca-Cola and Adelaide. Adelaide created AU, a single metric brands can use to ensure their media gets the most attention. The partners ran a Diet Coke campaign to understand how attention metrics could be incorporated into a campaign measurement system to improve efficiency and to identify what drives success. Coca-Cola has an end-to-end (E2E) framework for measuring its campaigns; its key components measure human impact: head (whether an experience was noticed and recalled), heart (resonance and relevance), hand (purchase/shopping) and mouth (consumption). AU sits within the “head” component. An earlier trial in Europe took two campaigns, one for Aquarius and one for Coke, and optimized half the media on attention and half on exposure; the results clearly showed higher ad recall, recognition and impact when optimizing on attention rather than viewability. The Diet Coke campaign focused on four questions: Can attention metrics offer insights into media? What level of attention is needed to increase brand lift? How can we gain real-time insights? Can we leverage attention metrics to reduce ad waste? The methodology had three stages:
  1. A/B test: The campaign was split into two groups—one optimized to AU, the other business-as-usual (BAU), optimized to VCR and CTR. Findings show consistent results when optimizing to attention.
  2. Max AU analysis: This takes the single highest-AU impression per respondent to control for frequency and uses actual response data to gauge lift, suggesting the level of attention at which single impressions become impactful. Exposure to media above 35 AU resulted in higher ad recall, purchase intent and favorability among consumers.
  3. AU flight control: This considers the relationship between frequency of exposure and Lucid survey results at different levels of media quality, suggesting the AU above which media is cumulatively impactful. Regression analysis identified the minimum AU needed to drive consistent outcomes: at a given frequency, the correlation between ad exposures and purchase intent increases above 20 AU, is strongest above 29 AU and peaks at 38 AU.
There is an opportunity to drive incremental lift: 1) exposure to high-AU media drives brand lift, indicating AU is a proxy for Coca-Cola’s KPIs; 2) identifying the minimum level of AU required for a KPI uncovers significant efficiencies; 3) attention metrics provide a real-time window into brand performance. Next steps: measure AU everywhere, and explore leveraging high-AU PMPs to provide targeting opportunities.
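As a rough illustration of the minimum-AU logic described above, the sketch below sweeps candidate AU thresholds over hypothetical (max-AU, outcome) response pairs and returns the lowest threshold at which the above-threshold group outperforms the below-threshold group. The function name and data shapes are illustrative assumptions, not Adelaide's actual methodology:

```python
def min_effective_au(responses, thresholds):
    """Return the lowest AU threshold showing positive lift.

    responses: list of (max_au, outcome) pairs, where max_au is the
    respondent's single highest-AU exposure and outcome is 0/1
    (e.g. recalled the ad or not). thresholds: candidate AU cutoffs.
    """
    def rate(group):
        # Share of positive outcomes in a group of (au, outcome) pairs.
        return sum(outcome for _, outcome in group) / len(group) if group else 0.0

    for t in sorted(thresholds):
        above = [r for r in responses if r[0] >= t]
        below = [r for r in responses if r[0] < t]
        # Require both groups to be non-empty before comparing lift.
        if above and below and rate(above) > rate(below):
            return t
    return None
```

With synthetic data where only high-AU exposures convert, the sweep surfaces the first cutoff that separates the groups, mirroring the "minimum AU for a KPI" idea in the talk.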

Key Takeaways

  • Exposure to high-AU (Adelaide’s attention metric) media drives brand lift, indicating AU is a proxy for Coca-Cola’s KPIs.
  • Identifying the minimum level of AU required for a KPI uncovers significant efficiencies.
  • Attention metrics provide a real-time window into brand performance.


Let’s Face It: Facial Coding Isn’t Up to the Task

Elise Temple, Ph.D., VP, Neuroscience & Client Service, Nielsen IQ



There are limitations in measuring the emotional impact of video. Facial expressions are powerful: neuroscience has shown that specialized parts of our brain are dedicated to faces, and developmental psychology and consumer neuroscience also demonstrate their importance. The question is whether the Facial Action Coding System (FACS) is the best tool for measuring consumers’ emotional response when viewing ads. Nielsen IQ found that measuring people’s facial expressions requires high-quality video, good lighting and good algorithms. Nielsen IQ has tested across over 2,000 ads in 15 countries; after initial R&D and global deployment, facial coding became a standardized diagnostic integrated with eye tracking. The first phase found that facial expressions are only reproducible in strong and consistent moments: happiness is reproducible only when it is really big—strong smiles—while for negative emotions, reliability and reproducibility were as low as .25. People watching TV, however, are mostly neutral, so reproducible expressions don’t occur often. Brain measures such as EEG, by contrast, fluctuate much more and are highly dynamic. Put differently, there is a gap between facial expressions and EEG waves: emotional expression does not equal emotion in the brain (r = 0.07). Is FACS predictive of anything meaningful? No—there is no significant relation between ad-driven sales lift and any emotional expression. Facial coding is not the right tool for measuring emotional response to video ads. It is not reproducible (when measured continuously, you don’t get the same answer twice); not sensitive (at reproducible levels, expressions are rare); not meaningful (expressions do not equal emotion and do not reflect dynamic emotion in the brain); and not predictive (smiles do not equal sales and don’t correlate with outcomes).

Key Takeaways

  • Facial coding is not a good measure of emotional response to video ads.
  • Only smiles were found to register in a somewhat reliable manner with facial coding.
  • There is no significant relation between ad-driven sales lift and any emotional expression.


Using Attention AI to Predict Real-World Outcomes

Max Kalehoff, VP Marketing Growth, Realeyes

Johanna Welch, Global Mars Horizon Comms Lab Senior Manager, Mars



Mars and Realeyes have demonstrated a connection between creative attention and sales performance. Mars’ Agile Creative Expertise (ACE) tool tracks visual attention and emotional responses to digital video ads: visual attention AI and facial coding measure how long participants watch a video and how their attention changes as they watch. The model has been proven to work—optimizing content and lifting sales by up to 18% in 19 markets, with $30 million in ad optimizations in 18 months. ACE can: 1) predict sales accurately while learning how consumers behave and think; 2) optimize—improve performance through creative selection; 3) scale—establish a fast, scalable solution. The model links attention, emotion and memory: 1) attention is access to the brain and lets the brand enter consciousness; 2) facial reactions build memories; 3) the impact is higher consideration, conversions and sales. The ACE solution: 1) participant exposure—24-48 completion, 150-500 viewers drawn from a pool of 200M+ people; 2) attention detection—deep learning that collects viewer attention through a natural viewing experience; 3) actionable scores—ML and AI analytics that assess performance and deliver scores. Company impact: validated predictions established connections to behavioral and sales data via over 4,000 sales/ads data points and benchmarks. Mars also used ACE to improve performance on TikTok, Facebook, Instagram and YouTube, achieving an 18% cumulative sales lift, and at global scale has scored over 1,000 creatives in 18 months. In conclusion, ACE is the biggest attention database and has received a U.S. patent for visual attention detection. Mars hopes to share ACE with other companies; the next steps are taking a pre-testing tool to in-flight content and examining brand equity.

Key Takeaways

  • Establishing the connection between creative attention and sales performance is key.
  • Mars’ Agile Creative Expertise (ACE) tool tracks visual attention as well as emotional responses to digital video ads.
  • The model has been proven to work—optimizing content and lifting sales.


In-Home Psychophysiological Television Measurement

Pedro Almeida, CEO, MindProber

Lauren Zweifler, SVP Insights & Research, Client Strategy & Insights Org, NBCU



How should attention be measured? This presentation introduced MindProber, a tool that combines biometrics, physiology, conscious/dial AND survey methods to assess attention. MindProber’s measurement is passive—second-by-second emotional engagement captured through electrodermal activity (EDA), aka galvanic skin response (GSR)—and active, measuring cognitive response via an optional app feature that lets panelists indicate liked/disliked content. The panel is N = 1,500, growing to 3,000 by end of year. Pedro Almeida (MindProber) emphasized three indicators of a metric’s value: 1) validity—it measures what it is supposed to measure; GSR is linearly related to perceived arousal in video formats and captures emotional intensity (for example, spikes during key moments of a soccer game, such as goals); 2) reliability—metrics are stable across samples, with 98.3% accuracy in ranking high- vs. low-impact ads; 3) predictive power—metrics predict events of interest. There is a carry-over effect: the more involved you are with the content, the more you respond to the ads; premium content leads to higher advertising impact, and engagement with content carries over to engagement with ads. Emotional impact is an indicator of the likelihood of remembering the brand. MindProber generates macro insights through a robust taxonomy: 150+ sessions analyzed, 250+ hours of programming watched, 20,000+ individual hours monitored, 10,000+ individual participations, 22,700+ events tagged and 8,500+ ads. Ninety-eight percent of commercial activity is as engaging as the content. Findings: 1) validity exists; 2) reliability—98.3% accuracy in ranking high- vs. low-impact ads; 3) predictive validity—higher emotional impact for content means higher emotional impact for advertising (the carry-over effect), and a higher emotional impact score means higher brand recall.
Lauren Zweifler (NBCU) showed that MindProber successfully identified the most crucial moments in both scripted and unscripted shows (iconic holiday TV moments for the former; Top Chef and America’s Next Top Model for the latter) with validity, reliability and predictability. What’s next?
  • How can we best predict lower funnel outcomes?
  • What elements of creative really matter?
  • How do we optimize for CTV?
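To make the passive EDA/GSR idea concrete, here is a toy spike detector over a second-by-second EDA trace: it flags seconds where the signal exceeds the trailing-window mean by a z-score margin, the kind of "key moment" spike described for the soccer example. The windowing, threshold and function name are illustrative assumptions, not MindProber's actual pipeline:

```python
import statistics

def flag_engagement_spikes(eda, window=10, z=2.0):
    """Flag indices (seconds) where an EDA trace spikes above its
    trailing-window baseline by z standard deviations."""
    spikes = []
    for i in range(window, len(eda)):
        base = eda[i - window:i]          # trailing baseline window
        mu = statistics.fmean(base)
        sd = statistics.pstdev(base)
        if sd > 0 and eda[i] > mu + z * sd:
            spikes.append(i)
    return spikes
```

Tagging such spikes against programming and ad timestamps is one plausible way the "22,700+ events tagged" figure could be operationalized.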

Key Takeaways

  • New tool that captures emotional engagement through skin response.
  • Three fundamental metrics are validity, reliability and predictability.
  • Generated macro insights through a robust taxonomy. Premium video content yields strong emotional engagement (EE).


Why Visual Attention is a Flawed Measure of Attention

Duane Varan, Ph.D., CEO, MediaScience



Because of the complexity of attention, Duane Varan proposes focusing on inattention and regarding attention as the absence of inattention. Attention is the threshold above which cognitive processing of stimuli occurs; it is not linear and is best understood as occurring above a threshold (inattention), after which other variables can deliver their value. There are many different kinds of attention, and different measures capture different types; inattention, on the other hand, is a constant construct. Phase 1 of the study examined literature-proven measures (except fMRI) and found the four best methods to be heart rate (HR), visual fixations, blink duration and EEG (alpha wave). MediaScience and the Ehrenberg-Bass Institute concluded that eyes-on-screen is not best-in-class because: 1) people are often paying attention even when not looking at the screen; 2) even when people look at the screen, they aren’t always paying attention—at least with fixations you see movements. Phase 2 used ads as stimuli and found that two measures worked best: heart rate and EEG. Varan pointed out that as an industry we have been fixating on eye tracking, but many ways of measuring eye tracking are inaccurate AND lose accuracy over time, and there is a misalignment between what we are measuring (and on which platforms) and what this means for different platforms. Even when accurate, eye tracking is not the best-in-class measure of attention. EEG is a great measure but is not scalable; heart rate, on the other hand, can scale. A pilot project—massive in scale—covered 50 programs with n > 4,000 using the Stream Pulse platform: 1,576 in-lab sessions and 2,437 in-home sessions. Results demonstrate that HR via fitness trackers is viable, with correlations as high as .81, though overall the correlation was only .32. While promising, HR is a complex measure that needs more work to clean noise and harness its full potential. Next steps: 1) continue mining existing data to devise strategies for identifying and cleaning noise factors; 2) run a large pilot “in the wild” using natural fitness-tracker data combined with ACR. Open questions: Where is attention being oriented—toward the brand? If not, what is the value? And what effect does this have? Finally, not all measures lend themselves to AI. An AI measure is only as good as its reference data—is it measuring attention or something else? Many companies train against sales-lift measures; attention is needed to produce sales lift, but that does not make sales lift a measure of attention. And with what validation?

Key Takeaways

  • Inattention should be the focus.
  • Eye tracking is not the best measure; heart rate and EEG are better but only heart rate is scalable.
  • Heart rate is viable and scales so there is promise in its application.
  • Be skeptical (but hopeful) of AI measures of attention. Proper scientific validation is necessary before we adopt them.


Decode Digital Video Attention by Environments in the Wild

Bill Harvey, Executive Chairman, Bill Harvey Consulting, Inc.

Sophie MacIntyre, Ads Research Lead, Meta



This research had two main objectives: to establish whether there is evidence for distinct environment types within digital media and to understand the implications for the use of attention metrics. The study focused on a mobile “in the wild” test using simulated media environments, conducted and analyzed by Realeyes in partnership with Eye Square and Bill Harvey Consulting. It was scoped to video ads on six media platforms—Meta (Facebook and Instagram), Hulu, Snapchat, TikTok, Twitter and YouTube—and three environment categories: Feed (Facebook, Instagram, Twitter), Short Form (Facebook, Instagram and Snapchat Stories; TikTok Takeover, Feed and TopView; and YouTube Shorts) and Stream (Facebook In-Stream, Hulu pre- and mid-roll, and YouTube skippable and non-skippable). The study was conducted in three stages: a pre-exposure survey, followed by in-context viewing (ad visibility, skips, scrolls, on-screen attention and reactions) and a post-exposure survey (examining brand recognition, ad recall, brand trust, ad liking and persuasion). Several factors were held constant: 1) the effect of creative was isolated; 2) the audience was held constant through randomization and isolation (each person sees only one ad at a time). Main findings include:
  1. Across digital environments relaxation is the key mode.
  2. While attention norms vary between environments, brand recognition is comparable and delivers the same effect.
  3. There were similar results across the funnel across different format types—apart from ad recall.
  4. The effect of attention on brand outcomes differs across environments. More attention = higher ad recall and better brand recognition. Feed and short form environments saw the same ad recall with shorter average time and fewer long exposures.
  5. Marketers should tailor creative to take advantage of different second-by-second attentive profiles. In Stream there is a more constant level of attention across creative—more time to tell the story. In Feed and Short Form, concentrating your attention is a conscious decision—you’re enjoying the fun of choosing—while in Stream it’s like being in the back seat of a limo: less choice. These are different types of attention.
  6. Across all environments, consumer attentive behavior decreases with increasing familiarity.
  7. When you do a study like this it is important to filter out those people who aren’t typically users of the platform. Newbies don’t know how to escape the ads.
In conclusion, effectiveness of attention varies across environment. Caution is needed in averaging across categories. Attentive behavior evolves across time. Open questions:
  • How can we distinguish between positive and negative attention? Perhaps through facial attention measurement.
  • How to understand forced vs. earned time?
  • What are the tradeoffs between attention, brand outcomes and cost?
  • How can we layer on emotion data and/or additional data?

Key Takeaways

  • There are different profiles of consumer behavior across environments.
  • Despite this, brand outcomes are comparable, suggesting each environment attains value in a different way.
  • Attention has a different relationship with outcomes across environments and users.


Meaningful Attention – The Complexity of Human Perception

Jeff Bander, Chief Revenue Officer USA, eye square

Matthias Rothensee, Ph.D., Chief Scientific Officer & Partner, eye square

Attention is a hot topic, but what matters most, according to these researchers, is identifying “meaningful” attention—attention that is focused, purposeful, deeply processed and effective. eye square’s research on attention focuses on the attention hierarchy (motion and novelty), ambiguity that attracts both brain and eye, and the pop-out effect. To measure attention, one needs:
  1. Decent methodology of measuring attention through eye tracking—decent accurate noise-free signal.
  2. Authentic media exposure—live, in-context portfolio.
Major results (through meta-analysis):
  • Attention span is short. Only 10% of all advertising contacts are longer than 8 seconds.
  • The relationship between attention and recognition is not linear: the first 2.5 seconds are where the greatest effect is created.
  • Longer viewing time does not always lead to more memory. An Instagram vs. YouTube study shows that Instagram, despite lower attention, has higher recognition—perhaps related to the exposure situation: voluntary attention on Instagram vs. forced exposure on YouTube.
  • There are different qualities of attention and these in turn impact/shape behavioral effects.
  • There are cultural, gender, geographic, age, race and ethnicity differences that guide attention.

Key Takeaways

  • Begin with the end in mind.
  • Focus on goal of attention.
  • Tie meaning to goal.


Beyond Reach, the Importance of Measuring Ad Resonance

Tom Weiss, Chief Data Scientist, MarketCast

Megan Daniels, SVP of Product, MarketCast

Tom Weiss and Megan Daniels of MarketCast introduced a new metric in their breakout session: brand effect resonance. The product evolved from one called Brand Effect, which uses a combination of surveys (15,000 consumers per day) and behavioral data across linear, social, digital (popular websites) and streaming. First developed by IAG, then owned by Nielsen and then Phoenix, Brand Effect is the main engine of the brand-effect resonance rating system, which was created to overcome gaps in reach measurement. MarketCast believes it can now isolate and show exactly how content and platform quality impact advertising performance. Resonance here is defined as how well people remember the ad, how well they understood the creative and the message, and how well they can link it back to the brand. Ad resonance measurement is said to be able to isolate the impact of content and platform on ad recall.

Key Takeaways

  • Reach measurement has drawbacks: although it captures audience size, it does not measure ad impact. It treats all impressions as equal regardless of content, and while platform and content quality matter, their impact cannot currently be proven.
  • Ten percent of respondents to a recent MarketCast survey found that in a digital environment, ads associated with premium content (professionally produced vs. user-generated content (UGC)) had more credible messaging.
  • Sixty-two percent who watched a premium clip remembered the ad and the brand correctly versus 49% of those who viewed a UGC clip. In addition, 56% of premium clip viewers thought the ad spoke directly to them, versus 43% who said an ad in a UGC clip spoke to them.
  • The resonance score is built from CTV/streaming ACR data, opportunity to see (OTS) surveys for linear TV, tags on digital ads (known exposure) and on social—OTS through memory triggers and surveys.
  • Using the always-on approach, they survey people about ads they have seen on TV, CTV or social within a 24-hour window. The large pool of participants is used to normalize all other factors, so as to see which network or platform would best suit a client’s ad.
  • Ad resonance rating is meant to become an interoperable metric that complements traditional reach and frequency measures and other currency metrics.
