
The ARF Attention Measurement Validation Initiative: Phase 1 Report Updated

  • ARF ORIGINAL RESEARCH

Attention metrics have drawn a great deal of energy in the last few years, for many reasons: the loss of behavioral signals due to privacy restrictions, growing frustration with ad viewability and its perceived limitations, attention metrics’ impact on the cross-platform measurement debate and the fact that biometric technologies can now be applied “in the wild,” rather than just in labs. The ARF’s Attention Measurement Validation Initiative aims to describe the attention measurement space in detail, illuminating this nascent sector. The Phase One findings include a comprehensive literature review and a report that maps out the vendor landscape in this increasingly diverse specialty. The report includes two sections. The first describes which methods are being used, what these companies report, and how and what they measure, whether ad creative or the media environment. The second includes in-depth overviews of the 29 participating attention measurement companies. The Phase One Report is a must-read for anyone interested in attention metrics or in which companies are operating in the space.


Augmented Reality – Unlock New Technology to Drive Brand Growth

Aarti Bhaskaran, Global Head of Research & Insights, Snap

Kara Louis, Group Research Manager, Snap

Aarti Bhaskaran and Kara Louis of Snap presented a synthesis of their work on augmented reality (AR), with key data and client case studies from the last two years. Showcasing the growth of the AR landscape, Aarti and Kara highlighted how consumers are gravitating toward AR and the expanding opportunities for advertisers to reach new audiences and utilize AR within the media mix. Case studies included brands such as Champs Sports and Clearly eyeglasses using AR try-on technology. Key takeaways:
  • AR usage is widespread and growing, from Boomers to Gen Z. By 2025, there will be approximately 4.3 billion AR users across all generations.
  • Almost all marketers (91%) think consumers use AR for fun, but more consumers prefer using AR for shopping (67%) than for fun (53%).
  • Interacting with products that have AR experiences leads to a 94% higher purchase conversion rate, as individuals can better assess them and feel connected with brands. Certain AR applications can substitute for physical shopping, with different features applying at different stages of the customer journey.
  • Interactive and personalized shopping experiences reach Gen Z—92% are interested in using AR for shopping, with over half of Gen Z saying they’d be more likely to pay attention to ads using AR. Gen Zs who have experienced items first using AR are also twice as likely to buy them as those who haven’t.
  • AR lenses on Snapchat outperformed all other media formats. Other platforms would need 14-20 ads to generate the same level of attention as Snapchat lenses.
  • AR not only drives short-term impact with higher purchase intent and brand preference, but it also improves brand opinion, influences implicit associations and increases likelihood to purchase and recommend.
  • Creative attributes including logo and product branding, complexity, messaging and user experience show a significant relationship with AR performance on brand lift.


Determining the Value of Emotional Engagement to TV

Pedro Almeida, CEO, MediaProbe

Context matters—not all reach is equal, so we need a way to qualify and value each impression. A valuation metric needs to be valid, reliable and predictive of business outcomes. The research focus: 1) What can we say about the value of emotional engagement (EE)? 2) Can we model the value of EE via its impact on memory? 3) Can we use EE to optimize and value content and ad positions? How? Methodology: MediaProbe used Galvanic Skin Response with participants who were exposed to content through a MediaProbe panel (U.S., 2,700 households). Data is delivered second by second and used to create an impact measure of how strongly people are reacting to what they are watching. The platform calculates an impact value that enables comparisons across media platforms. An added layer assesses whether participants are leaning into the content and are engaged. The U.S. TV dataset includes over 45,000 participants and more than 85,000 hours of viewing; more than 1,000 TV hours and over 42,500 ads have been monitored. Using a subset of 16,351 ads and 329 “premium pod” formats, participants watch content and are then asked which ads they remember. Findings:
  1. Enhancing the emotional impact of an ad by 150 EIS points equates to adding a second 30-second ad unit and increases the probability of brand recall by 15%; each additional 100 points increases the probability of brand recall by 10% (see the sketch after this list).
  2. The single best predictor of whether someone will respond to an ad is how engaged that person was with the content prior to the ad; EE carries over into the ad break. More engaging pre-break content, earlier breaks and earlier positions within a break all lead to higher ad impact.
  3. However, this differs across genres: genre moderates pre-break emotional patterns, and there is further differentiation within genres. For instance, people react differently to ad breaks when watching soccer vs. some other sport. MediaProbe finds a 66% similarity across various award shows in EE around ad breaks. This data is used to establish the value of ads placed in different breaks (first, second, etc.) and pods. Emotional engagement helps better predict ad performance.
  4. Additional findings show that first-in-break still rules and that premium pods deliver higher recall.
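As a rough illustration of the first finding, the sketch below encodes the reported linear relationship between EIS uplift and brand-recall probability (about 10% per 100 EIS points, so 150 points corresponds to roughly 15%). The function name and the assumption of strict linearity are for illustration only; MediaProbe's actual model is not described in this summary.

```python
def estimated_recall_uplift(delta_eis: float) -> float:
    """Illustrative only: the summary reports ~10% higher probability of
    brand recall per +100 EIS points (so +150 points ~ +15%). Strict
    linearity and this helper are assumptions, not MediaProbe's model."""
    return (delta_eis / 100.0) * 10.0

# A +150 EIS improvement, which the presenters equate to adding a second
# 30-second ad unit, implies roughly a 15% higher probability of recall.
print(estimated_recall_uplift(150))  # -> 15.0
```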
Key takeaways:
  • Ad EIS is systematically associated with ad recall.
  • It is possible to optimize ads for estimated impact by advertising in the most engaging content and being present after the most engaging moments.
  • Different genres tend to have typical pre-break engagement morphologies. This makes it possible to estimate the delivered value of each pod position (and order within the break when relevant).


Aligning with Rituals: The Contextual Foundation of Audio

Prayushi Amin, Associate Director, Magna Global

Idil Cakim, SVP, Research & Insights, Audacy

Audio is a daily ritual at the heart of consumers’ days. Given the richness of audio experiences, should brands strive for contextual alignment? And what is contextual alignment? There are two types: genre based (aligning with an audio content genre that is contextually relevant to the brand) and ritual based (aligning with an audio ritual or behavior that is contextually relevant to the brand). Methodology: a controlled test to quantify the impact of genre- and ritual-based contextual alignment; weekly audio listeners were recruited from a representative online panel and listened to content of their choice for roughly 30 minutes. Listeners then answered brand metric questions to determine ad effectiveness. Findings:
  1. Ads in context perform better. The brands feel more relevant.
  2. Ritual-aligned audio taps into purchase intent and genuine interest in the product.
  3. Listeners feel more connected to the brand when hearing contextually aligned ads.
  4. Listeners who felt energized or excited were more receptive to the ad. Audio during rituals gets people motivated and more open to noticing ads.
Implications:
  1. Ensure contextual targeting is a part of your digital audio planning to drive transactional next steps.
  2. Explore rituals to reach a highly engaged audience and amplify the effectiveness of your audio.
Audacy ran a campaign to promote the Audacy app across radio stations: four markets, 22 stations, 20 unique promos, over six weeks of media. Findings show that the rituals campaign worked—increases in app downloads are directly attributable to it. The campaign particularly influenced heavy radio listeners, parents, adults 35-54 and cross-platform listeners. Key takeaways:
  • Audio rituals work.
  • Audio rituals targeting works.
  • Audio can be sliced further for even more precise targeting.


Measuring Attention and Outcomes for Audio Advertising

Mike Follett, CEO, Lumen

Joanne Leong, Global Head of Planning, Dentsu

Lumen and Dentsu measured attention in audio. Audio is obviously a key component of the media mix, but the main challenge is how to create attention metrics for audio that are comparable to visual metrics. Can eye tracking be applied to audio, and if so, how? Previous research shows that ads have to be noticed, though not necessarily looked at, to drive results; some form of attention is needed to make ads work. Seventy percent of viewable ads are not viewed and as such do not sell. Research also shows that longer ads drive better outcomes in terms of prompted recall and choice uplift. Eye movements are only the first part of a process that may lead to memory and action. Lumen measures 1) how many ads are viewable for the user; 2) whether they are viewable (per MRC standards); 3) percent viewed; 4) view time in seconds; 5) APM in seconds; 6) cost per attentive impression (a rough sketch of such metrics follows the takeaways below). Eye tracking works by recording video of viewers’ eyes while they are on screen, a simple behavioral metric. Afterwards, Lumen asks questions to understand the relationship between eye movements and other measures. Audio works differently. We lack information about the percent of people who listened and average listening time, but these can be inferred via an audio-visual equivalence: how much visual attention would generate the same recall as the audio ad? According to the presenters, the inferential model works quite well. They infer likely levels of audio attention from several factors: exposure time, brand recall, choice uplift and forced vs. voluntary exposure. Methodology: they measured people listening to radio, podcasts and streaming audio services, across three forms of audio advertisements, collecting audio and recall data from thousands of people and inferring how much visual attention would have been needed to achieve the same result. Finding: equivalent attention metrics can be produced for audio. This data is built into Dentsu’s planning tools for use when trading teams are contemplating which media to buy, and the research shows that audio generates attention at a lower cost. In a digital world, the goal is measuring live campaigns, and planners and clients are used to getting impression-level data about viewability or audibility. The audio industry has the ability to supply this data; individual data on podcasts and streaming could help demonstrate the true power of audio campaigns. Challenge to the industry: now that the potential power has been demonstrated, we need impression-level data to be able to measure live campaigns. Key takeaways:
  • Radio is an extremely cost-effective way of reaching people and driving outcomes.
  • We have benchmarks, we want measurement, we need impression-level data.
  • Combine attention data with outcomes data to tell a compelling story.
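For readers newer to attention metrics, the sketch below shows how impression-level attention data of the kind described above might be aggregated into percent viewed, average view time and cost per attentive impression. The record fields and the definition of an "attentive" impression (any positive view time) are assumptions for illustration, not Lumen's or Dentsu's published methodology.

```python
from dataclasses import dataclass

@dataclass
class Impression:
    viewable: bool      # met a viewability standard (e.g., MRC)
    view_time_s: float  # seconds of measured (or inferred) attention
    cost: float         # media cost attributed to this impression

def attention_metrics(impressions: list[Impression]) -> dict:
    """Illustrative aggregation only; 'attentive' is assumed here to mean
    any positive view time, which is a simplification."""
    viewable = [i for i in impressions if i.viewable]
    attentive = [i for i in viewable if i.view_time_s > 0]
    total_cost = sum(i.cost for i in impressions)
    return {
        "pct_viewable": len(viewable) / len(impressions),
        "pct_viewed": len(attentive) / len(viewable) if viewable else 0.0,
        "avg_view_time_s": (sum(i.view_time_s for i in attentive) / len(attentive)) if attentive else 0.0,
        "cost_per_attentive_impression": (total_cost / len(attentive)) if attentive else float("inf"),
    }

# Example with three hypothetical impressions.
print(attention_metrics([
    Impression(viewable=True, view_time_s=2.5, cost=0.01),
    Impression(viewable=True, view_time_s=0.0, cost=0.01),
    Impression(viewable=False, view_time_s=0.0, cost=0.01),
]))
```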
 


OOH Measurement’s Game Has Changed

Christina Radigan, SVP, Research & Insights, Outfront

Christina Radigan of Outfront explored the advantages of out-of-home advertising (OOH) and discussed advancements in its measurement techniques. Christina noted that with the loss of cookies and third-party data, contextual ad placement will see a renewed sense of importance, and in OOH, location is a proxy for context, driving content. She further highlighted the benefits of OOH, citing a recent study by Omnicom, using marketing mix modeling (MMM), which found that increased OOH spend drives revenue return on ad spend (RROAS). This research also highlighted that OOH is underfunded, representing only 4% to 5% of the total media marketplace. Following up on this, Christina pointed to attribution metrics, which measure the impact of OOH ad exposure on brand metrics and consumer behaviors, to demonstrate OOH's effectiveness at the campaign level. Expanding on their work in attribution, she noted changes stemming from the pandemic: format proliferation and greater digitization, privacy-compliant mobile measurement ramping up (opt-in survey panel and SDK), and performance marketing and measurement becoming table stakes for budget allocations. New measurement opportunities from OOH intercepts include brand lift studies, footfall, website visitation, app downloads and app activity, and tune-in. Finally, she examined brand studies conducted for Nissan and Professional Bull Riders (PBR), showcasing the effectiveness of OOH advertising in driving recall, ticket sales and revenue. Key takeaways:
  • MMMs return to the forefront, as models become more campaign sensitive and are privacy compliant (powered by ML and AI).
  • A study from Omnicom, using MMM, found that optimizing OOH spend in automotive increased brand consideration (11%) and brand awareness (19%). In CPG food, optimizing OOH spend increased purchase intent (24%) and optimizing OOH spend in retail grocery increased awareness (9%).
  • OOH now represents a plethora of formats (e.g., roadside ads, rail and bus ads, digital and print) and has the ability to surround the consumer across their journey, providing the ability to measure up and down the funnel, in addition to fueling behavioral research.
  • Key factors for successful measurement in OOH: feasibility (e.g., scale and scope of the campaign, reach and frequency), the right KPIs (e.g., campaign goal) and creative best practices (Is the creative made for OOH?).
  • OOH advertising is yielding tangible outcomes by boosting consumer attention (+49%). Additionally, there has been a notable surge in advertiser engagement (+200%).
  • Ad recall rates in OOH continue to increase (e.g., 30% in 2020 vs. 44% in 2023).


Neuro: TV Brand Attraction Advantage Over Digital

Bill Harvey, Executive Chairman, Bill Harvey Consulting

Elizabeth Johnson, Ph.D., Executive Director & Senior Fellow, Wharton Neuroscience Initiative, UPenn

Michael Platt, Ph.D., Director, Wharton Neuroscience Initiative, UPenn

Audrey Steele, EVP Sales Research Insights & Strategy, FOX Corp.

The presenters discussed their study on the link between attention and sales. Attention is required for engagement, but eyes on screen do not predict sales well. However, three main brain measurement dimensions account for sales and branding effects: brand attraction/joy (motivational signals in fMRI and EEG), memory (theta power in EEG) and synchrony (collective resonance across audience brains, in fMRI and EEG). All three require more than 1-2 seconds to unfold and measure. Neuroanalysis can help unmask hidden thoughts and feelings (via fMRI), and, scaled up, other bio and neuro metrics can be just as predictive. The research shows that patterns of brain activity, the sum of all perceptual, attentional, emotional, social and memory processes, predict sales best. EEG can also tell us about frustration, attention, memory and sleep/introspection. Research shows that EEG measures of brand attraction/joy can predict 80% of the variance in sales; notably, brand attraction/joy takes 15 seconds to peak. Brain memory also predicts sales, and memory encoding picks up after 10 seconds. Finally, synchrony, the collective audience response, predicts more than 90% of sales but also has temporal dynamics: it peaks at 5 seconds and picks up again after 15-20 seconds. Wharton Neuroscience investigated predicting how different content and platforms impact sales lift. The study design: eight ads in eight verticals tested in each of 10 experimental cells (7 TV, 2 smartphones, 1 control condition). Four ads at a time are shown within a TV show, and each viewer sees only one kind of content. Findings from 3% of the total sample: attraction and memory are sustained for ads shown in premium channels compared to YouTube. The value of context is enormous: YouTube shows a drop at 4 seconds whereas TV continues. Key takeaways:
  • Attention is an incomplete measure by which to select media contexts and platforms for specific campaigns.
  • Premium longform content and contexts have more sales and branding impact than digital, especially for new customer growth, due to emotional immersion in the TV context vs. the brevity of ad attention/engagement in digital.


The Power of Radio Through the Lenses of Emotional Engagement

Pedro Almeida, CEO, MediaProbe

Pierre Bouvard, Chief Insights Officer, Cumulus Media | Westwood One

The presentation focused on determining the emotional impact of AM/FM radio ads. MediaProbe was retained by Cumulus Media to measure second-by-second electrodermal activity (EDA), a measure of the sympathetic nervous system, to see when it is activated and whether listeners were excited by the stimulus they heard. This is expressed as the Emotional Impact Score (EIS), an impact metric that captures how excited people are on a second-by-second basis and what elements drive this emotion. It is an objective way of quantifying emotion in media and advertising content, capturing implicit emotional data (what people feel). Throughout the session, participants can also dial the moments that they like or dislike (the conscious, explicit capture of likes and dislikes) and are asked pre- and post-session questions to learn more about recall and purchase intent. Methodology: 36 AM/FM radio ads in a simulated 30-minute broadcast across four genres (urban, news, adult contemporary and rock/oldies). Each “broadcast” had three ad breaks, and the average commercial break had three ads. In total, 227 people participated; each “broadcast” had a sample size of 75 people, consumers listened to at least three of the four broadcasts and each ad was exposed to 225 people. Findings:
  1. AM/FM radio programming outperforms MediaProbe’s U.S. TV norms by 13%. Put differently, the emotional impact score is higher when listening to radio.
  2. Carry-over effect: radio advertising pods receive a 12% higher Emotional Impact Score than TV advertising pods, making radio a premium platform.
  3. Across genres, people are more engaged when listening to news—people are processing what is being said, they are paying attention. There is no valence contamination between what is being said on the news and the emotional engagement to ads.
  4. People are more engaged during radio advertising—4% more than radio content.
  5. Looking at 32 individual radio ads measured by MediaProbe, there is on average a 5% higher emotional impact score in comparison to 4,670 individual TV ads measured by MediaProbe. This research is consistent with other lab-based studies.
  6. MediaProbe also conducted a physical feature analysis of the creative to find that: 1) higher pitch contrast between programming content and ads leads to higher impact. If the content has low pitch, ads should be higher pitch and vice-versa; 2) louder ads lead to higher impact.
  7. Using a regression analysis, MediaProbe found the following best-performing creative attributes in radio ads: 1) a female voiceover; 2) jingles/background music; 3) five brand mentions (the optimal number); 4) no disclaimers. This too is consistent with other research.
Key takeaways:
  • AM/FM radio programming is more engaging than TV, according to MediaProbe.
  • They also found that AM/FM radio advertising outperforms TV advertising.
  • News is the most impactful genre as a high-quality contextual environment for advertising.
  • Sound contrast between radio programming and ads drives higher attention and brand recall.
  • Creative best practices: female voiceover, jingles, one voiceover and five brand mentions.
 


How Co-viewing and Other Factors Impact Viewer Attention to CTV

Monica Longoria, Head of Marketing Insights, LG Ad Solutions

Tristan Webster, Chief Product Officer, TVision

The research presented combined an online survey of over 1,000 respondents with data from TVision’s 5,000+ U.S. home panel. Questions asked: 1. Does CTV garner more attention? 2. Are consumers more likely to co-view CTV? 3. Does co-viewing negatively affect attention? TVision’s always-on panel equipment includes a webcam that captures how many people are in the room and eyes on screen second by second, and a router meter that identifies which CTV device is on and detects apps. TVision’s measurement engine includes remote device management and an ACR (automatic content recognition) engine. Findings:
  1. CTV in general has a 13% higher attention index; attention increases due to purposeful watching. Co-viewing of CTV has a stronger impact in comparison to linear (75% higher).
  2. Streaming is a popular co-viewing experience with a mostly non-negative impact on attention. Households with kids are more likely to pay attention to streaming content and ads and are 36% more likely to discuss what they see on TV. There are three different types of co-viewing: a family setup with different age groups (increased attention depends on genre), an adults-only setup with similar gender and age (biggest impact on attention) and a mixed adults-only setup.
  3. Streaming is gaining ground as a co-viewing method for watching sports, which is typically done with other people.
Implications for brands and marketers:
  1. CTV offers the opportunity to create more engaging ads with higher levels of attention; its digital capabilities garner more attention. There is a need to create ads that are specific to CTV (in contrast to linear).
  2. Co-viewing can be an opportunity to turn your brand into a discussion.
  3. Measurement providers give us new insights into viewer behavior.
Key takeaways:
  • There is higher attention with CTV in comparison to linear.
  • Positive impact of co-viewing: Co-viewing on streaming platforms is popular and generally maintains or increases attention.
  • Streaming is increasingly preferred for watching sports in a co-viewing context, offering new opportunities for targeted advertising and engagement in sports content.
  • Implications for brands and advertisers: The engaging nature of CTV offers ample opportunities for more impactful ads. Co-viewing experiences can transform ads into discussion points among viewers, enhancing brand engagement.


Human Experience: Why Attention AI Needs Human Input

Dr. Matthias Rothensee, CSO & Partner, eye square

Stefan Schoenherr, VP Brand and Media & Partner, eye square

Speakers Matthias Rothensee and Stefan Schoenherr of eye square discussed the need for a human element and oversight of AI. Opening the discussion on the state of attention and AI, Matthias acknowledged that the race for attention is one of the defining challenges of our time for modern marketers. He quoted author Rex Briggs, who noted the "conundrum at the heart of AI: its greatest strength can also be its greatest weakness." Matthias indicated that AI is incredibly powerful at recognizing patterns in big data sets, but at the same time there are risks attached to it (e.g., finding spurious patterns, hallucinations). Stefan examined a case study using an advertisement for the candy M&Ms, which compared eye-tracking results from real humans with attention predictions from AI. The goal was to better understand where AI is good at predicting attention and where it still has to improve (a generic sketch of how such agreement can be quantified appears after the takeaways below). Results indicated areas for AI improvement in gaze cueing, movement, contrast, complexity and nonhuman entities (e.g., a dog). The static nature of AI prediction models, which are often built on static attention databases, can become a challenge when assessing dynamic attention trends. Key takeaways:
  • Predictive AI is good at replicating human attention for basic face and eye images, high-contrast scenes (e.g., probability of looking at things that stand out) and slow-paced scene cuts where AI can detect details.
  • AI seems unaware of a common phenomenon called the "cueing effect" (humans not only pay attention to people's faces but also to where they're looking), which leads to incorrect predictions.
  • AI has difficulties deciphering scenes with fast movements (e.g., AI shows inertia) in contrast to slow-paced scenes where AI excels in replicating human feedback. In this case human feedback is more accurate.
  • AI is drawn more to contrast (e.g., in an ad featuring a runner, AI gave attention to the trees surrounding the runner), whereas humans can identify the main subject of an image.
  • AI decomposes human faces (e.g., AI is obsessed with human ears), whereas humans can detect the focal point of a human face. In addition, AI hallucinates, underestimating facial effects.
  • AI has difficulties interpreting more complex visual layouts (e.g., complex product pack shots are misinterpreted).
  • AI is human centric and does not focus well on nonhuman entities such as a dog (e.g., in scenes where a dog was present, AI disregarded the dog altogether).
  • AI tends to be more static in nature (e.g., AI prediction models are often built based on static attention databases), which can be a problem when comparing this to dynamic attention trends.
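To make the human-versus-AI comparison concrete, one generic way to quantify how closely an AI-predicted attention map matches human eye-tracking data is to correlate the predicted saliency map with a fixation heatmap for the same frame. The sketch below uses a simple Pearson correlation on same-shaped arrays; it is a generic illustration under those assumptions, not eye square's methodology.

```python
import numpy as np

def saliency_agreement(predicted: np.ndarray, fixations: np.ndarray) -> float:
    """Pearson correlation between an AI-predicted saliency map and a human
    fixation heatmap of the same shape. Higher values indicate the model
    better replicates where people actually looked. Generic sketch only."""
    p = (predicted - predicted.mean()) / (predicted.std() + 1e-8)
    f = (fixations - fixations.mean()) / (fixations.std() + 1e-8)
    return float((p * f).mean())

# Example with random stand-ins for one video frame (real inputs would be a
# model's saliency output and an aggregated eye-tracking heatmap).
rng = np.random.default_rng(0)
pred = rng.random((90, 160))
gaze = rng.random((90, 160))
print(saliency_agreement(pred, gaze))
```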
