Engaging consumers on social media platforms is extremely advantageous for firms, yet sustaining that engagement is not always easy: membership often drops off. The field experiment described in this Marketing Science Institute (MSI) working paper shows effective ways to sustain social media engagement and offers insights into the positive effects of granting certain members power in their online community (OC).
Brands increasingly use social media influencers as part of their marketing strategy, yet methods for measuring their effectiveness and for choosing optimal partners are unreliable. New research analyzing the key performance indicators (followers and engagement) of 180 influencers across five social media platforms identifies patterns that marketers can use to establish partnerships with the right influencers.
Aarti Bhaskaran – Global Head of Ad Research & Insights, Snap Inc.
Kara Louis – Group Research Manager, Snap Inc.
With 90% of Gen Z using their platform, Snap’s Aarti Bhaskaran and Kara Louis shared six top trends from three years of studying this cohort through a combination of consumer insights and media measurement. As the most diverse generation in the U.S., Gen Z values authenticity and looks for personal online experiences in ways that inform how they engage with content.
The data presented showed how Gen Z are visual communicators and mobile video natives. They trust real content from friends and family, pay attention early in ads, care about purpose messaging and want immersive shopping experiences, ideally personalized with AR.
The Q&A after the presentation covered other aspects of the study including global differences, ad lengths, content creators and costs, and Gen Z-favored brands.
Brands looking to connect with Gen Z should:
Adapt to visual communication: 95% of Gen Z have used visual communication when messaging friends, and 54% of Gen Zs agree that digital avatars/Bitmojis help them express themselves.
Leverage mobile video: For Gen Z, mobile is a complement to TV, with 1 in 2 (54%) liking to watch shorter shows or bite-size highlights of TV shows.
Ensure real, brand-safe content: 89% say it’s important to watch videos in a place that feels like a trusted and safe space.
Use immersive experiences: Interactive and personalized shopping experiences are a must—92% of Gen Z are interested in using AR for shopping, with over half of Gen Z saying they’d be more likely to pay attention to an ad that uses AR.
Capture their attention early: Despite having the lowest overall active attention across generations, Gen Z has the highest active attention in the first 8 seconds.
Feature purpose-driven messaging: 73% of Gen Z are likely to be loyal to a company that speaks to social issues, posts information or has ads about social change.
Johanna Welch – Global Mars Horizon Comms Lab Senior Manager, Mars
Mars and Realeyes have proven the connection between creative attention and sales performance. Mars’ Agile Creative Expertise (ACE) tool tracks visual attention and emotional responses to digital video ads. Visual attention AI and facial coding measure how long participants watch a video and how their attention changes as they watch. The model has been proven to work: optimizing content has lifted sales by up to 18% across 19 markets and driven $30 million in ad optimizations in 18 months.
ACE can:
1. Predict sales accurately while learning how consumers behave and think.
2. Optimize—improve performance through creative selection.
3. Scale—establish a fast, scalable solution.
The model links attention, emotion and memory. Accordingly:
1. Attention is access to the brain and enables the brand to enter into consciousness.
2. Facial reactions build memories.
3. Impact—higher consideration, conversions and sales.
1. Participant exposure: 24-48 hour completion, with 150-500 viewers drawn from a pool of more than 200 million people, observed as they watch.
2. Attention detection: deep learning, collect viewer attention through natural viewer experience.
3. Actionable scores: ML and AI analytics to assess performance and deliver scores.
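As an illustration only, the attention-detection and scoring steps above might aggregate per-second gaze observations into a creative-level score. The function name and the scoring scheme below are assumptions made for this sketch, not the actual ACE/Realeyes model.

```python
# Illustrative sketch only: aggregating per-second attention flags into a
# creative-level score. The scoring scheme is an assumption, not the
# actual ACE/Realeyes model.
def attention_score(viewer_tracks):
    """viewer_tracks: one list per viewer of 0/1 flags, one flag per
    second of the ad, where 1 means the viewer's gaze was on screen."""
    per_viewer = [sum(track) / len(track) for track in viewer_tracks if track]
    if not per_viewer:
        return 0.0
    # Average each viewer's share of attentive seconds across the panel.
    return sum(per_viewer) / len(per_viewer)
```

In practice the per-second flags would come from the deep-learning attention detection described in step 2, with the panel sizes from step 1.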
Company impact: validated predictions proved connections to behavioral and sales data via over 4,000 sales/ads data points and benchmarks. Mars also used ACE to improve performance on TikTok, Facebook, Instagram and YouTube, achieving an 18% cumulative sales lift. Global scale: over 1,000 creatives scored in 18 months. In conclusion, ACE is the largest attention database and has received a U.S. patent for visual attention detection. Mars hopes to share ACE with other companies, and the next step is taking a pre-testing tool to in-flight content and examining brand equity.
Establishing the connection between creative attention and sales performance is key.
Mars’ Agile Creative Expertise (ACE) tool tracks visual attention as well as emotional responses to digital video ads.
The model has been proven to work, optimizing content and lifting sales.
In this session, Tania Yuki and Brian Pugh of Comscore explored the impact of frequency and latency on cross-platform advertising effectiveness. In her opening, Tania presented consumer trends and touchpoints for better understanding cross-media reach and optimizing platforms for specific outcomes. In her discussion, Tania acknowledged the challenges of measurement due to the constant introduction of new innovations and the adoption of new behaviors to track. She also recognized the considerable increase in connected devices per household since the pandemic. Tania pointed out complexities in the current media ecosystem arising from the degree to which media has merged despite comprising separate platforms (e.g., linear TV, social media, online video). In addition to all the changing behavior in media consumption, she noted that the emergence of Generation Z is beginning to change the rules for establishing brand love and loyalty. In his discussion, Brian examined findings from the measurement of 400 cross-platform campaigns to understand trends in platform mixes. Brian noted the continued growth of social media and CTV along with the decline in linear TV, though he acknowledged linear still remained "king." Furthermore, he found that multi-screen campaigns performed better than single-platform campaigns.
The number of connected devices per household has increased from 9 to 12 since the pandemic, creating a more complex path for reaching consumers.
Despite being separate platforms (e.g., linear TV, social media, online video, etc.) media is “inextricably commingled together,” leading to "context switching and about getting the right content to the right consumer."
In terms of long-form video, "Linear television is still the juggernaut in the room at 205 billion [viewing] hours." Total video across linear, CTV and digital grew 5% year-over-year in the U.S. CTV viewing increased by 14% of the total hours watched.
Short-form video continues to rise in popularity through Instagram Reels, TikTok and YouTube Shorts. This trend in short-form video consumption is growing in double-digit percentages and redefining video consumption across mobile and connected TV screens.
The emergence of Generation Z is changing how marketers approach brand love, loyalty and long-term value, as this cohort's consumer behavior, in particular its lack of brand loyalty, contrasts with that of previous cohorts.
In terms of media consumption, Generation Z are heavy movie watchers (37%), preferring dramas (29%) and cooking shows (23%). Additionally, they expressed interest in local news and documentaries.
Social media is still growing (11%) but there are fewer linear TV households (-9%) as people are consuming media elsewhere and CTV has increased substantially (32%).
Though there was a clear decline in linear TV viewership, linear TV remains supreme regarding total viewership for one channel.
In terms of incremental reach over the length of a campaign, linear TV reached a large share of viewers early in a campaign, but over time the study indicated "reaching incremental people on CTV and digital more often." This finding highlighted the advantages of a cross-screen campaign in optimizing reach.
Adding screens in a campaign improved brand lift but the variability of results also increased. Additionally, results for ad recall and other variables followed a similar pattern. It was noted that the optimal platform mix depended on the target audience.
Yannis Pavlidis – VP, Data Science and Analytics, DISQO
In this session, Yannis Pavlidis of consumer insights and CX firm DISQO tackled the challenges of benchmarking cross-media outcomes and brand lift given incomplete data from siloed platforms and media channels. In the opening, Yannis provided a refresher on the importance of benchmarks and the obstacles posed by existing approaches to benchmarking (e.g., inconsistent methodologies, outdated data and collection techniques). The discussion examined solutions to the data-collection issues in benchmarking ad impact, streamlining the process with consented, single-source data. The presentation also examined calculating benchmarks from data taken from one source group (rather than two unaffiliated groups), considering the recency of the campaigns used and the subsequent behavior(s) that can then be correlated with survey responses. The advantage of using consented single-source data is that it can lead to more insightful, relevant and consistent benchmark outcomes.
Challenges with existing approaches to benchmarks included the following:
Inconsistent methodologies across social networks make data comparison difficult when assessing cross-media campaigns.
Behavioral data is often aggregated from more than one source, making data triangulation inefficient and unreliable (e.g., comparing audiences that are not the same).
Outdated benchmarking data often fails to capture more recent substantial changes in the U.S. consumer landscape and the introduction of Generation Z to the consumer marketplace.
Inefficiencies in the benchmarking process are addressed by using the same audience and methodologies across social platforms. Data and information gleaned from surveys and behaviors of consumers come from a single source. In addition, results from campaigns focus on the past three years, creating recency and relevancy.
Benchmark calculations are based on campaigns no older than March 2021. The median lift score is calculated as the difference between the exposed group and the control group.
Different categories are considered when specific benchmarks are calculated. In addition, a threshold of 15 brands was implemented to ensure variety and statistical significance.
Audiences surveyed are opt-in and tracked using metered data to assess ad exposure and downstream data. Surveys are provided to exposed and matched control individuals to assess attitudinal changes. Additionally, surveys and behavior can be correlated.
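The lift calculation described above (exposed minus matched control, with the median taken across campaigns to form a benchmark) can be sketched as follows. The function names and data shapes are assumptions for illustration, not DISQO's implementation.

```python
from statistics import median

def campaign_lift(exposed_rate, control_rate):
    # Lift is the difference between the exposed group's rate (e.g., of a
    # favorable survey response) and the matched control group's rate.
    return exposed_rate - control_rate

def median_lift_benchmark(campaigns):
    """campaigns: iterable of (exposed_rate, control_rate) pairs, one per
    campaign; the benchmark is the median lift across campaigns."""
    return median(campaign_lift(e, c) for e, c in campaigns)
```

Taking the median rather than the mean keeps the benchmark robust to a few campaigns with unusually high or low lift.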
Bill Harvey – Executive Chairman, Bill Harvey Consulting, Inc.
Sophie MacIntyre – Ads Research Lead, Meta
This research had two main objectives: to establish whether there is evidence for distinct environment types within digital media and to understand the implications for the use of attention metrics. The study focused on a mobile "in the wild" test using simulated media environments, conducted and analyzed by Realeyes in partnership with Eye Square and Bill Harvey Consulting. The study was scoped to video ads on six media platforms—Meta (Facebook and Instagram), Hulu, Snapchat, TikTok, Twitter and YouTube—and three environment categories—Feed (Facebook, Instagram, Twitter), Short Form (Facebook, Instagram and Snapchat Stories, TikTok Takeover, Feed and TopView, and YouTube Shorts) and Stream (Facebook InStream, Hulu Pre- and Mid-Roll, and YouTube Skippable and Non-Skippable). The study was conducted in three stages: a pre-exposure survey, followed by an in-context view (ad visibility, skips, scrolls, on-screen attention and reactions) and then a post-exposure survey (examining brand recognition, ad recall, brand trust, ad liking and persuasion). A number of constants were held: 1) isolating the effect of creative; 2) holding the audience constant through randomization and isolation (each person sees only one ad at a time).
Main findings include:
Across digital environments relaxation is the key mode.
While attention norms vary between environments, brand recognition is comparable and delivers the same effect.
There were similar results across the funnel across different format types—apart from ad recall.
The effect of attention on brand outcomes differs across environments. More attention = higher ad recall and better brand recognition. Feed and short form environments saw the same ad recall with shorter average time and fewer long exposures.
Marketers should tailor creative to take advantage of different second-by-second attentive profiles. In Stream there is a more constant level of attention across creative, with more time to tell the story. In Feed and Short Form, viewing involves a conscious decision to concentrate your attention; you're enjoying the fun of choosing. In Stream it's like being in the back seat of a limo: less choice. So there are different types of attention.
Across all environments, consumer attentive behavior decreases with increasing familiarity.
In a study like this, it is important to filter out people who aren't typical users of the platform, since newcomers don't know how to skip the ads.
In conclusion, effectiveness of attention varies across environment. Caution is needed in averaging across categories. Attentive behavior evolves across time.
How can we distinguish between positive and negative attention? Perhaps through facial attention measurement.
How to understand forced vs. earned time?
What are the tradeoffs between attention, brand outcomes and cost?
How can we layer on emotion data and/or additional data?
There are different profiles of consumer behavior across environments.
Despite this, brand outcomes are comparable, suggesting each environment attains value in a different way.
Attention has a different relationship with outcomes across environments and users.
Tina Daniels – Managing Director, Agency & Brand Measurement Analytics, Google
Nicole Gileadi – Global Product Lead, Google
In this session, Tina Daniels and Nicole Gileadi examined Google's principles for charting the course for third-party cross-media audience measurement. Tina acknowledged that more third-party measurement companies were expressing interest in working closely with Google, given its stature as the world's largest video provider. In her discussion, she acknowledged that this interest generated the need for Google to create a set of principles to offer to both measurement companies and key clients to guide the process. After reviewing these principles, Tina and Nicole held an open discussion about them. Topics included premium and high-quality content, long-form versus short-form video and the measurement of this content. In addition, Nicole touched on the importance of content and the context surrounding an ad. Other areas included exposure metrics (e.g., Where is my audience? Did I reach them?) as well as providing signals to conduct an impact analysis.
The following are the five principles Google shared with the industry, to act as guidance for third-party measurement companies interested in working with Google:
Google expects measurement companies to be comprehensive, meaning a holistic view of audiences across all platforms.
Measurement should be fair and comparable.
Privacy-centricity is extremely important. Only privacy-centric solutions can meet consumer expectations and be durable for marketers in the long term.
Measurement should be independent and trustworthy, meaning both objective and transparent, ideally with endorsement from a third party such as the MRC.
Measurement solutions must be actionable for advertisers.
The struggle that the advertising and marketing industry is currently having is that "there is no universal definition of content quality that is easily measurable in cross-media systems."
"Content quality is being used as this proxy for content impact." For example, "What is the impact of the content on my brand equity, my campaign objective, by marketing or business objectives?" All of these factors are specific to the marketer, the brand and the campaign.
When it comes to exposure metrics, advertisers and marketers should be consistently counting impressions across all channels, "because you need to count things to value them."
Tom Weiss and Megan Daniels of MarketCast introduced a new metric in their breakout session: brand-effect resonance. This product evolved from one called Brand Effect, which uses a combination of survey (15,000 consumers per day) and behavioral data across linear, social, digital (popular websites) and streaming. First developed by IAG, then owned by Nielsen and then Phoenix, Brand Effect is the main engine of the brand-effect resonance rating system, which was created to overcome gaps in reach measurement. MarketCast believes it can now isolate and show exactly how content and platform quality impact advertising performance. Resonance here is defined as how well people remember the ad, how well they understood the creative and the message, and how well they can link it back to the brand. Ad resonance measurement is said to be able to isolate the impact of content and platform on ad recall.
Reach measurement has drawbacks: though it captures audience size, it does not measure ad impact. Reach measurement treats all impressions as equal regardless of the content, and while platform and the quality of the content matter, their impact cannot currently be proven.
Ten percent of respondents to a recent MarketCast survey found that in a digital environment, ads associated with premium content (professionally produced vs. user-generated content (UGC)) had more credible messaging.
Sixty-two percent who watched a premium clip remembered the ad and the brand correctly versus 49% of those who viewed a UGC clip. In addition, 56% of premium clip viewers thought the ad spoke directly to them, versus 43% who said an ad in a UGC clip spoke to them.
The resonance score is built from CTV/streaming ACR data, opportunity to see (OTS) surveys for linear TV, tags on digital ads (known exposure) and on social—OTS through memory triggers and surveys.
Using the always-on approach, they survey people about ads they have seen on TV, CTV or social within a 24-hour window. The large pool of participants is used to normalize all other factors, so as to see which network or platform would best suit a client's ad.
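As a purely illustrative sketch of the definition given earlier (recall, understanding and brand linkage, normalized against the wider participant pool), a resonance-style score might look like the code below. All names, the data shape and the normalization scheme are assumptions, not MarketCast's actual method.

```python
# Hypothetical sketch of a resonance-style score; not MarketCast's method.
def resonance_score(responses, pool_baseline):
    """responses: survey answers for one ad, each a dict with boolean
    'recalled_ad', 'understood_message' and 'linked_brand' fields.
    pool_baseline: the average hit rate across the whole participant
    pool, used to normalize out non-platform factors."""
    if not responses:
        return 0.0
    hits = sum(1 for r in responses
               if r["recalled_ad"] and r["understood_message"] and r["linked_brand"])
    raw = hits / len(responses)
    return raw / pool_baseline  # values above 1.0 beat the pool average
```

Dividing by a pool-wide baseline is one simple way to express "normalize all other factors": an ad scoring above 1.0 resonates better than the average ad in the pool, regardless of overall recall levels on that platform.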
Ad resonance rating is meant to become an interoperable metric that complements traditional reach and frequency measures and other currency metrics.