
Augmented Reality – Unlock New Technology to Drive Brand Growth

Aarti Bhaskaran, Global Head of Research & Insights, Snap

Kara Louis, Group Research Manager, Snap

Aarti Bhaskaran and Kara Louis of Snap presented a synthesis of their work on augmented reality (AR), drawing on key data and client case studies from the last two years. Showcasing the growth of the AR landscape, Aarti and Kara described how consumers are gravitating toward AR and the expanding opportunities for advertisers to reach new audiences and to use AR within the media mix. Case studies included brands using AR try-on technology, such as Champs Sports and Clearly eyeglasses. Key takeaways:
  • AR usage is widespread and growing, from Boomers to Gen Z. By 2025 there will be approximately 4.3 billion AR users across all generations.
  • Almost all marketers (91%) think consumers use AR primarily for fun, but more consumers report using AR for shopping (67%) than for fun (53%).
  • Interacting with products through AR experiences leads to a 94% higher purchase conversion rate, as individuals can better assess products and feel connected with brands. Certain AR applications can substitute for physical shopping, with different features applying at different stages of the customer journey.
  • Interactive and personalized shopping experiences reach Gen Z—92% are interested in using AR for shopping, and over half of Gen Z say they’d be more likely to pay attention to ads that use AR. Gen Z consumers are also twice as likely to buy an item they have first experienced in AR.
  • AR lenses on Snapchat outperformed all other media formats. Other platforms would need 14-20 ads to generate the same level of attention as Snapchat lenses.
  • AR not only drives short-term impact with higher purchase intent and brand preference, but it also improves brand opinion, influences implicit associations and increases likelihood to purchase and recommend.
  • The creative attributes that include logo and product branding, complexity, messaging and user experience show a significant relationship with AR performance in brand lift.

Download Presentation

Member Only Access

Intent and Impact: A New Measurement for DEI

James Ambalathukal, Director, Strategy & Insights, Magid

Mike Bloxham, EVP, Global Media & Entertainment, Magid

Mike Bloxham and James Ambalathukal of Magid partnered with twelve networks and streaming services on a study to identify factors of cultural authenticity in drama, comedy and unscripted programs. Drawing on qualitative studies and online surveys into the creative elements that resonate with diverse populations, Mike and James described the importance of authenticity in how audiences relate to emotional content, how they see themselves in the content and, ultimately, how they perceive the content itself. Diverse audiences assess various levels of signals as good or bad representation, including storytelling components and physical production elements, which helps separate what drives positive versus negative perceptions of these shows and makes the results actionable. Key takeaways:
  • There are different levels of expectations with different genres. Sitcoms and reality content without representation can connect to audiences if relatable character journeys and storylines are present. Projecting family and community values goes far. In dramas, applying specificity and non-verbal cultural details on the set or in a character, like authentic hair and wardrobe, even if not part of the narrative, is a driver of authentic representation. Other kinds of content like adult animated shows, news programming and sports are not viewed through a DEI lens.
  • Marginalized communities value representation but don’t want to be reduced to just the racial and ethnic parts of their identity.
  • Effective representation is strongly connected to perceptions of authenticity.
  • Authenticity isn’t just a preference; it has real impact on content engagement.
  • Story elements influencing perceptions of authenticity share similarities and differences across various cohorts.


Human Experience: Why Attention AI Needs Human Input

Dr. Matthias Rothensee, CSO & Partner, eye square

Stefan Schoenherr, VP Brand and Media & Partner, eye square

Speakers Matthias Rothensee and Stefan Schoenherr of eye square discussed the need for a human element in, and human oversight of, AI. Opening the discussion on the state of attention and AI, Matthias acknowledged that the race for attention is one of the defining challenges of our time for modern marketers. He quoted author Rex Briggs, who noted the "conundrum at the heart of AI: its greatest strength can also be its greatest weakness." Matthias indicated that AI is incredibly powerful at recognizing patterns in big data sets, but at the same time carries risks (e.g., finding spurious patterns, hallucinations). Stefan examined a case study using an advertisement for M&M's candy, which measured real human attention with eye-tracking technology and compared the results to AI predictions. The goal was to better understand where AI is good at predicting attention and where it still needs to improve. Results from the case study indicated areas for AI improvement in gaze cueing, movement, contrast, complexity and nonhuman entities (e.g., a dog). The static nature of AI (e.g., AI prediction models are often built on static attention databases) can become a challenge when comparing predictions to dynamic attention trends. Key takeaways:
  • Predictive AI is good at replicating human attention for basic face and eye images, high-contrast scenes (e.g., probability of looking at things that stand out) and slow-paced scene cuts where AI can detect details.
  • AI seems unaware of a common phenomenon called the "cueing effect" (humans pay attention not only to people's faces but also to where those people are looking), which leads to incorrect predictions.
  • AI has difficulty deciphering scenes with fast movement (AI shows inertia), in contrast to slow-paced scenes, where it excels at replicating human feedback; for fast-moving scenes, human data is more accurate.
  • AI over-attends to contrast (e.g., in an ad featuring a runner, AI gave attention to the trees surrounding the runner), whereas humans can identify the main subject of an image.
  • AI decomposes human faces (e.g., AI fixates on ears), whereas humans detect the focal point of a face. In addition, AI hallucinates, underestimating the effect of faces.
  • AI has difficulty interpreting more complex visual layouts (e.g., complex product pack shots are misinterpreted).
  • AI is human centric and does not focus well on nonhuman entities such as a dog (e.g., in scenes where a dog was present, AI disregarded the dog altogether).
  • AI tends to be more static in nature (e.g., AI prediction models are often built based on static attention databases), which can be a problem when comparing this to dynamic attention trends.


Health, Charity and Green Messaging Highlighted in JAR Prosocial-Themed Issue


A JAR 2022 call for papers on prosocial advertising generated a deluge of submissions aimed at helping advertisers use more effective means of communicating in ways that benefit society. The result is the newly published March issue focused on messaging strategies that help consumers make informed decisions on health, the environment and charity.


Navigating the Evolving Media Landscape

  • OTT 2023

The media landscape continues to evolve, arguably at a faster rate than ever. Leading media and measurement experts presented research-based insights on how viewers use different forms of TV/video on various platforms. Attendees joined us at the Warner Bros. Discovery Studios in California and via livestream to understand the latest data and discussions of the data’s implications.


Unlocking Reach in Premium Content

NBCU’s Mike Levin and Emily Kwok tested brand safety in premium video content from a viewer perspective, using NBCU’s proprietary AI technology for automating brand safety and suitability decisions. The study’s three objectives asked whether increasingly violent episodes influence viewers’ experiences, whether viewers assign blame to marketers for knowingly advertising in explicit or violent content, and whether there are specific instances where adjacency affects viewer sentiment toward an ad.

Going Steady: How Long Will (My Cross-Media Campaign) Last?

In this session, Tania Yuki and Brian Pugh of Comscore explored the impact of frequency and latency on cross-platform advertising effectiveness. In her opening, Tania walked through consumer trends and touchpoints to better understand cross-media in terms of reach and optimizing platforms for specific outcomes. She acknowledged the challenges of measurement created by the constant introduction of new innovations and the adoption of new behaviors to track, and she recognized the considerable increase in connected devices per household since the pandemic. Tania pointed out complexities in the current media ecosystem, in which formerly separate platforms (e.g., linear TV, social media, online video) have increasingly converged. In addition to changing media-consumption behavior, she noted that the emergence of Generation Z is beginning to change the rules for establishing brand love and loyalty. In his discussion, Brian examined findings from the measurement of 400 cross-platform campaigns to understand trends in platform mixes. Brian noted the continued growth of social media and CTV along with the decline of linear TV, though he acknowledged linear still remained “king.” Furthermore, he found that multi-screen campaigns performed better than single-platform campaigns.

Context Matters

Heather Coghill (WBD) and Daniel Bulgrin (MediaScience) shared methodologies and results from two in-lab studies that sought to understand how impactful category priming can be without brand mention and if viewers associate brands with adjacent unsuitable content.

Their presentation focused on two types of contextual effects within program context—“excitation transfer” and “brand priming”.