RESPONSE
TO AUDIBLE
ASSIGNMENT
APRIL 27, 2015
CONTENTS
Section 1: The Challenge
Section 2: Defining the Audience
Section 3: Media Approach
Section 4: Channel Planning
Section 5: Modeling
Section 6: Innovation Roadmap
Section 7: Test & Learn
Section 8: Reporting
Section 9: Technology Solutions
Section 10: Measurement and Forecasting
//
AUDIBLE
417K Active Listeners
$50m Budget
//
THE CHALLENGE
2012 BRAND STUDY SHOWED ROOM TO GROW
2012 prospect population: 112.8 million
[Chart: listening-to-consideration spectrum (currently listening to audiobooks; not currently listening but will consider; not currently listening and won't consider), split by Audible awareness. Audible unaware (77.9 MM): 18.23% (14.2 MM), 14.37% (11.2 MM), 67.39% (52.5 MM). Audible aware (34.9 MM): 22.63% (7.9 MM), 23.21% (8.1 MM), 54.15% (18.9 MM).]
Source: Audible Custom Study Findings, Jan 2012, Brand Asset Consulting
//
POSITIVE GROWTH SEEN ACROSS THE BOARD
2015 prospect population: 182 million
[Chart: willingness to advocate audiobooks (would advocate; would maybe advocate; would not advocate), split by Audible awareness. Audible unaware (113.2 MM): 14.75% (16.7 MM), 33.59% (37.9 MM), 51.84% (58.6 MM). Audible aware (69.1 MM): 22.36% (15.4 MM), 21.85% (15.1 MM), 55.77% (38.6 MM).]
Source: Primary Research, M&C Saatchi Mobile - Web Survey 2015
//
POSITIVE GROWTH SEEN ACROSS THE BOARD
2015 prospect population: 182 million
[Chart: 2015 vs. 2012 on the willingness-to-advocate-audiobooks spectrum. Improvement in Audible awareness: 7%. Improvement in consideration: 19.22%. Improvement in value sentiment: 15.55%. Further improvement shown: 32.30%.]
Source: Primary Research, M&C Saatchi Mobile - Web Survey 2015
//
MEDIA NEEDS TO CONTRIBUTE ON TWO FRONTS
[Matrix: willingness to advocate audiobooks (would / would maybe / would not) vs. Audible awareness and consideration]
• Aware, would advocate: push to trial if not currently a customer
• Unaware, would advocate: make them aware of Audible, push to trial
• Aware, would maybe advocate: convince them of the value of audiobooks, push to trial
• Unaware, would maybe advocate: make them aware of Audible, convince them of the value of audiobooks, push to trial
• Would not advocate (aware or unaware): not a focus
Source: Primary Research, M&C Saatchi Mobile - Web Survey 2015
//
DEFINING OUR AUDIENCE
OUR TARGET MARKET (category involved and open) is defined by curiosity
Smartphone owners (182m) Music consumers (155m) Readers (103m)
Audiobook users (34m)
Podcast listeners (46m)
TV Streamers (63m)
Source: GWI, MRI & ComScore, Pew, 2014-5
//
THE SWEET SPOT A combination of behavior and access
Curious
Digital
//
FINDING THE RIGHT AUDIENCE We suggest focusing on audiences with the right behaviors and scale
Curious
Commuters (71m)
Business Travelers (30m)
At home parents (36m)
Fitness Enthusiasts (70m)
Digital Cord Cutters (20m)
Empty nesters (20m)
Source: MRI 2014
//
KEY SURVEY & RESEARCH FINDINGS
• 57% of podcast listeners advocate listening to audiobooks.
• 87% of podcast listeners are between the ages of 18-45.
• 71% of 18-45 year olds subscribe to digital content services.
• People who enjoy reading books are 16x more likely to advocate audiobooks than people who don't enjoy reading.
• Curious people are 34x more likely to advocate audiobooks than people who are not curious.
• 62% of audiobook buyers choose audio over other book formats because they can listen to the book in the car. (Source: Audio Publishers Association ID 249827)
These three audiences exhibit curiosity and digital savvy, and all share behaviors that provide frequent opportunities to achieve flow.
TOTAL ADDRESSABLE SIZE: 171M PEOPLE

BUSINESS TRAVELERS (30M)
• 40% of podcast listeners listen to podcasts when they travel.
• 87% of podcast listeners who listen during travel subscribe to paid digital content services.
• 55% of podcast listeners who listen during travel identify as curious people.

FITNESS ENTHUSIASTS (70M)
• 71% of podcast listeners listen to podcasts when they exercise.
• 80% of podcast listeners who listen during exercise subscribe to paid digital content services.
• 50% of podcast listeners who listen during exercise identify as curious people.

COMMUTERS (71M)
• 51% of podcast listeners listen to podcasts while they commute.
• 88% of podcast listeners who listen while commuting subscribe to paid digital content services.
• 50% of podcast listeners who listen while commuting identify as curious people.

(SOURCE: AUDIO PUBLISHERS ASSOCIATION ID 249827)
//
MEDIA APPROACH
MEDIA APPROACH
CURIOUS PEOPLE WANT TO BE IN A STATE OF FLOW: MAKING THE MOST OF EVERY MOMENT
//
MEDIA APPROACH 2012 BRAND SURVEY REVEALED THAT AUDIBLE IS UNLIKELY TO BE A SUBSTITUTE FOR OTHER ENTERTAINMENT
Barriers identified:
• Not perceived as value for money
• Perceived 'skill' at listening to audiobooks
• Audiobooks not a substitute for books or other entertainment
Source: Audible Custom Study Findings, Jan 2012, Brand Asset Consulting
Implications:
• Perception is highly likely to change when using the service
• Give people a reason to listen, not a context
//
MEDIA APPROACH
Universal Truth: Curious people want to be in a state of flow
+
Brand Truth: Audible complements your passion, putting you in a state of flow
//
MEDIA APPROACH
COMPLEMENT YOUR FLOW //
MEDIA APPROACH
‘COMPLEMENT YOUR FLOW’ IMPERATIVES

MEDIA
• Target curious people
• Be in moments of flow, or when people are considering flow
• Be alongside their passions

CREATIVE
• Identify their passion & align with it
• Be a complement to their passion, not a substitute
• Push to free content such as a trial
//
MULTI-CHANNEL MARKETING IMPROVES EFFICIENCY AND VOLUME
We recommend a multi-channel approach.
[Chart: efficiency vs. spend. Direct response in a single channel vs. direct response & brand across multiple channels, which delivers better efficiency and larger volume.]
//
RESEARCH

TV
• Insight: Audible is competing in the entertainment category, not just audiobooks.
• Approach: align with TV properties through targeting & sponsorship to gain cultural relevancy.

DIGITAL RADIO
• Insight: our audience's radio consumption is mostly digital: 80% of their audio consumption.
• Approach: concentrate on online digital radio only.

OUTDOOR & EXPERIENTIAL
• Insight: our audience can be identified within certain locations when seeking flow.
• Approach: be in moments of flow with outdoor and experiential.

PODCAST SPONSORSHIP
• Insight: podcast usage is growing and it's the nearest proxy to audiobooks, currently at 46m listeners in the US alone.

DIGITAL VIDEO (MOBILE & DESKTOP)
• Insight: audiobook considerers are more likely to be digital content subscribers.
• Approach: align with their interests through targeted digital video.

DR: MOBILE, PAID SOCIAL & DESKTOP
• Insight: mobile, paid social & desktop are proven acquisition channels and an excellent catch-all.
• Approach: continue investment with a joined-up approach with other media.

Sources: Edison Research, 2015
//
GENERATE & HARVEST DEMAND WITH MULTI-CHANNEL APPROACH
TV & Digital Radio: culturally relevant positioning
Outdoor & Experiential: be in moments of flow
Podcast Sponsorship: drive consideration in moments of flow
Digital Video (Desktop & Mobile): drive consideration & conversion
DR: Mobile, Paid Social & Desktop: efficient conversion
DEMAND GENERATION 32% | HYBRID 14% | DEMAND HARVEST 54% //
SUGGESTED BUDGET SPLIT ($50M)
• TV: $8M
• Podcast Sponsorship: $2M
• Digital Radio: $2M
• Outdoor & Experiential: $6M
• Digital Video (Desktop & Mobile): $5M
• DR: Mobile, Desktop & Paid Social: $27M //
EXAMPLE EXECUTION
TV
Podcast Sponsorship
Digital Radio
Digital Video (Desktop & Mobile)
Outdoor & Experiential
DR : Mobile, Paid Social & Desktop
//
CHANNEL PLANNING
[Flighting calendar, July through December:
• Demand generation: TV test; TV & digital radio; outdoor & experiential
• Demand harvest: podcast sponsorship; digital video; DR (80% mobile & paid social, 20% desktop)]
//
CHANNEL PLANNING
Prioritize budget for ATL based on commute times (30 mins+)
//
CITY PRIORITIZATION
Seattle San Francisco
Chicago
Atlanta
Los Angeles Washington New York
//
CHANNEL PLANNING
Split Budget by Population for ATL Activity (budget figures in $M)

TERRITORY | POPULATION | WEIGHTING | TV | DIGITAL RADIO | OUTDOOR & EXPERIENTIAL
New York | 8,405,837 | 0.48 | 3.82 | 0.96 | 3.36
Washington | 646,449 | 0.04 | 0.29 | 0.07 | X
Chicago | 2,718,782 | 0.15 | 1.24 | 0.31 | 1.09
Atlanta | 447,841 | 0.03 | 0.20 | 0.05 | X
Los Angeles | 3,884,307 | 0.22 | 1.77 | 0.44 | 1.55
San Francisco | 837,442 | 0.05 | 0.38 | 0.10 | X
Seattle | 652,405 | 0.04 | 0.30 | 0.07 | X
Total | 17,593,063 | 1.00 | 8.00 | 2.00 | 6.00

Podcast Sponsorship, Digital Video (Desktop & Mobile), and DR (Mobile, Desktop & Paid Social) carry totals of 2, 5, and 27 respectively. An alternative weighting using the Los Angeles metro population (16,370,000; total 30,078,756) was shown with most allocations TBC.
//
MODELING
HOW ECONOMETRICS MODELING FITS
[Diagram: behavioral analysis and audience insight/sizing inform a channel mix across Mobile, Desktop, TV, Radio, OOH, and Paid Social, with testing feeding back into the modeling.]
//
ECONOMETRICS MODELING
DATA
• Ensure data is validated
• Identify noise from signals

MODELS
• Apply data to models that will provide actionable insight
• Validate models through constant refinement via test & learn

DATA COLLECTING/SOURCING INFRASTRUCTURE
• Confirm data needs
• Validate data sources
• Ensure ongoing data collection process

MARKETING/INSIGHTS TEAM
• Build models to address key business questions
• Provide actionable insights to guide channel planning
//
WHICH CHANNELS DRIVE THE BEST RESULTS?
Marketing mix models are mathematical frameworks that leverage big data to identify which channels drive the best results in the context of the business environment. They also quantifiably demonstrate how marketing works together with other business factors to produce the outcome. Models allow marketing teams and senior leaders to use data to guide planning:
• Run simulations based on different marketing budgets and channel allocation strategies.
• Assess true attribution across channels for historical and planned campaigns.
[Diagram: marketing, owned media/PR, earned media/WoM, Audible app/website performance, brand/consideration metrics, business divisions, and seasonality/environment all contribute to the outcome: Audible listeners.]
//
ANALYTICS TO PARSE THROUGH BIG DATA
Intuition is not enough. With a plethora of data available, we need a quantifiable methodology to assess the combined effect of controlled and non-controlled factors on upper- and lower-funnel acquisition streams. Marketing mix modeling leverages big data to attribute outcomes to marketing by accounting for all the combinations of factors impacting the business. That insight enables us to plan effectively across channels and to forecast results from historical data, based on key assumptions about future controlled and non-controlled factors.
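To make the mechanics concrete, below is a minimal sketch of the kind of regression a marketing mix model rests on, written in Python with invented weekly numbers; the channel names, coefficients, and reallocation scenario are illustrative assumptions, not Audible data or the specific model we would build.

```python
import numpy as np

# Illustrative weekly data: spend by channel (columns) and ALs (outcome).
# All values are invented for the sketch; a real model would use the validated data stack.
rng = np.random.default_rng(0)
weeks = 104
spend = rng.uniform(0, 100, size=(weeks, 3))       # e.g. TV, digital radio, DR (hypothetical)
base = 400                                         # organic ALs per week without marketing
true_effect = np.array([1.2, 0.6, 2.0])
als = base + spend @ true_effect + rng.normal(0, 25, weeks)

# Fit a simple linear marketing mix model: ALs ~ base + channel spend.
X = np.column_stack([np.ones(weeks), spend])
coef, *_ = np.linalg.lstsq(X, als, rcond=None)
print("estimated base (organic ALs/week):", round(coef[0], 1))
print("estimated ALs per unit of spend by channel:", np.round(coef[1:], 2))

# Simulate a reallocation scenario: shift 20% of channel-2 budget into channel-3.
scenario = spend.copy()
shift = 0.2 * scenario[:, 1]
scenario[:, 1] -= shift
scenario[:, 2] += shift
baseline_pred = X @ coef
scenario_pred = np.column_stack([np.ones(weeks), scenario]) @ coef
print("predicted AL change from reallocation:", round(scenario_pred.sum() - baseline_pred.sum()))
```

The same structure scales up once adstock, saturation, and external factors are layered in, as described later in this section.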
[Diagram: model inputs include brand metrics, WoM, Facebook/Twitter, PR, paid/organic search, media, promotion, economy, app/web traffic, trade, price, and seasonality, driving in-app and web acquisition.]
//
ANALYTICAL ROADMAP
Start small: ensure the data collection and analytics plan is a pragmatic solution that starts small and builds out sophistication as results are proven through test & learn. Since it takes time to source all the data and provide files with the proper dimensions, an effective approach is to address specific marketing strategy and business questions in tiers:
1. Attribute acquisitions across marketing channels and base
2. Optimize the portfolio of audience segments and their channels
3. Optimize creative messaging
4. Identify the impact of long-term brand equity
Modeling partner landscape:
• Management consultants: McKinsey, Accenture
• Agent-based modeling: Marketing Revolution, Nielsen, ThinkVine
• Regression-based modeling: Adometry, VisualIQ, MMA
//
MODELING PROCESS
Depending on the analytics team, there are different approaches to modeling. Core to all of them is leveraging historical data to create response curves representing the impact of each marketing variable/factor in the context of external factors. Where data is limited, modelers apply Bayesian techniques that use human intuition/expertise and other qualitative information to guide parameter estimates and fill in data gaps.

DATA (10-16 WKS)
• Direct: paid media; business operations/core
• Indirect: owned media (app downloads, web visits, search volume, brand sentiment); WoM/earned media; social media; competitive activity; economy; seasonality
• External: holidays, 'acts of God' & relevant events
• Outcome: ALs

MODELS (4-6 WKS)
• MARKETING/BUSINESS: marketing and business factors that directly drive acquisitions
• BASE: level of organic demand for the product without marketing

CHANNEL INSIGHTS (2 WKS)
• Historical attribution analysis by channel
• Optimized channel allocation recommendation to guide planning

OUTPUT (4 WKS)
• Finalize channel allocation strategy and execute
//
MODELING VARIABLES
Identify data needs/sources, then collect, validate, and aggregate all data files, spanning controlled and uncontrolled factors, into ONE STACK for dashboard and modeling purposes.

PAID MEDIA (controlled, 1st party)
• National & local TV, national & local radio, print, outdoor, sponsorships
• Affiliate, online display, mobile, direct mail, email, paid search

NON-MEDIA CONTROLLED FACTORS
• Price & discounting, promotions

OWNED/EARNED MEDIA
• Brand awareness, consideration, attributes, etc.
• Social sentiment on FB, YT, Instagram, and other social networks
• Number of likes, followers, shares, etc.

APP/WEBSITE PERFORMANCE
• App downloads/engagement, Audible website traffic, organic search volume [GQV]
• Trials, take rate, trial-to-member conversion rate, customer type
• Audience segment geolocation data

ENVIRONMENT (uncontrolled, 3rd party)
• Competitive activity
• Macro indicators: GDP, S&P 500 index, unemployment, weather
• Seasonality; cultural/significant events (Olympics, elections, etc.)
//
DATA DETAILS: EXAMPLE
Models leverage a data stack of quantitative metrics expressed in a flat file [csv/xls/SAS]. Data is usually expressed as weekly or monthly metrics.

DATA DIMENSIONS (1st and 3rd party)
• Common to all channels: creative and/or campaign, week/month, market/DMA/zip code, spend
• National & local TV, national & local radio, outdoor, sponsorships: GRPs, impressions
• Online display, mobile, affiliate, paid search: impressions, clicks, click-throughs, conversions
• Direct mail: # sent, clicks, click-throughs, conversions
//
SAMPLE DATA
After data collection, we process the data and validate the results to ensure accuracy, as the quality of the models depends on good data. We create visual charts to trend the time-series data and make sure all outliers, missing data, and other anomalies are identified and accounted for before leveraging the set for the models.
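As an illustration of that validation pass, the sketch below assumes a weekly flat file with hypothetical column names (week, trials, web_visits, tv_spend); the file name and schema are placeholders, not the actual data stack.

```python
import pandas as pd

# Hypothetical weekly flat file; the file name and columns are illustrative only.
df = pd.read_csv("audible_weekly_stack.csv", parse_dates=["week"])

# 1. Check for gaps in the weekly index.
expected = pd.date_range(df["week"].min(), df["week"].max(), freq="W")
missing_weeks = expected.difference(pd.DatetimeIndex(df["week"]))

# 2. Flag potential outliers: values more than 3 standard deviations from a rolling 13-week mean.
flags = {}
for col in ["trials", "web_visits", "tv_spend"]:
    rolling_mean = df[col].rolling(13, min_periods=4).mean()
    rolling_std = df[col].rolling(13, min_periods=4).std()
    flags[col] = df.loc[(df[col] - rolling_mean).abs() > 3 * rolling_std, ["week", col]]

print("missing weeks:", list(missing_weeks))
for col, flagged in flags.items():
    print(f"possible outliers in {col}:")
    print(flagged)
```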
[Charts: multi-year weekly time series of trials and web store visits, used to spot outliers, missing data, and other anomalies before modeling.]
//
BUILDING MODELS Models take historical data and create response curves representing the impact of each marketing variable/factor. Where data is limited, modelers apply Bayesian techniques that use human intuition/expertise and other qualitative information to guide parameter estimates and fill in those gaps. During data analysis, we separate noise from signal to identify correlations and adstock/lag effects, then develop statistical models that predict responses at the desired levels of granularity for multiple metrics. Models combine stimulus-response regression models with time-series models that describe the influence of macro variables (seasonality, trends, external factors) as well as short-term market dynamics.
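For reference, a compact sketch of the adstock/lag transformation and diminishing-returns response curve described above; the decay and saturation parameters are placeholders that a modeler would estimate from data (or constrain with Bayesian priors when data is thin).

```python
import numpy as np

def adstock(spend, decay=0.5):
    """Carry a share of each week's media pressure into following weeks (geometric decay)."""
    carried = np.zeros(len(spend))
    for t, x in enumerate(spend):
        carried[t] = x + (decay * carried[t - 1] if t > 0 else 0.0)
    return carried

def response(adstocked_pressure, saturation=50.0):
    """Diminishing-returns curve: each extra unit of pressure earns progressively less."""
    return 1 - np.exp(-adstocked_pressure / saturation)

weekly_tv_spend = np.array([0, 40, 40, 40, 0, 0, 0, 20, 20, 0], dtype=float)
pressure = adstock(weekly_tv_spend, decay=0.6)   # lag effect: pressure persists after the flight ends
curve = response(pressure)                       # this transformed series feeds the regression
print(np.round(pressure, 1))
print(np.round(curve, 2))
```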
[Charts: model predictions at the desired level of granularity; overall model predictions.]
//
ENSURING BEST FIT Validating model accuracy ensures high confidence when working with the insights. Visualizing the mean absolute percent error (MAPE) between the modeled outcome and actual data provides transparency on "fit": how well the modeled outcome represents actual figures.
In Sample MAPE = 5.3% Out of Sample MAPE = 7.1%
In general, <5% is an excellent fit; 5% to 10% is a strong fit; 10% to 15% is a moderate fit.
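For reference, MAPE itself is simple to compute; the figures below are invented purely to show the calculation.

```python
# Mean absolute percent error between actual and fitted values (invented numbers).
actual = [1_200_000, 1_350_000, 1_500_000, 1_425_000]
fitted = [1_150_000, 1_400_000, 1_430_000, 1_480_000]

mape = sum(abs(a - f) / a for a, f in zip(actual, fitted)) / len(actual) * 100
print(f"MAPE = {mape:.1f}%")   # ~4.1%, which would land in the 'excellent fit' band
```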
[Chart: actual vs. fitted values, 2008-2014, showing the in-sample period, the out-of-sample (holdout) period, and the forecast. In Sample MAPE = 5.3%; Out of Sample MAPE = 7.1%.]
//
TIME SERIES: BASE VS. MARKETING ATTRIBUTION
Marketing mix models split the outcome between overall marketing/business factors and base. This visualization illustrates the contribution of marketing to the bottom line.
Marketing/Business
• Marketing and business factors that directly drive acquisitions
• Contribution is also attributed to specific channels and other short-term factors
Base
• Factors that do not change very much (brand perceptions/awareness, etc.): long-term factors
• Level of organic demand for the product without marketing
//
ACTIONABLE INSIGHTS TO GUIDE PLANNING Visualize the % attribution of revenue across channels against % spend, and overlay the ROI Index to identify which channels to invest more in, optimize, or re-evaluate and potentially drop from the plan. This guides planning so we know which channels are most effective in driving revenue. ROI Index = % Attribution / % Spend x 100
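A worked example of the ROI Index rule, using invented spend and attribution shares; the 110/90 cut-offs used to bucket channels are illustrative assumptions, since the chart implies relative ranking rather than fixed thresholds.

```python
# ROI Index = % attribution / % spend x 100; the channel shares below are illustrative.
channels = {
    # channel:    (% spend, % attribution)
    "TV":          (0.25, 0.30),
    "Mobile DR":   (0.40, 0.45),
    "Outdoor":     (0.15, 0.10),
    "Catalog":     (0.20, 0.15),
}

for name, (spend_share, attribution_share) in channels.items():
    roi_index = attribution_share / spend_share * 100
    # 110/90 cut-offs are assumptions for the sketch, not thresholds defined in the deck.
    if roi_index >= 110:
        verdict = "invest more"
    elif roi_index >= 90:
        verdict = "optimize"
    else:
        verdict = "re-evaluate / drop"
    print(f"{name:10s} ROI Index {roi_index:5.0f} -> {verdict}")
```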
[Chart: % spend and % attribution by channel (TV, mobile, outdoor, catalog, newspaper, inserts, other media) with the ROI Index overlaid, segmented into "invest more," "optimize," and "re-evaluate / drop."]
//
RESULTS
A digital/print publisher suspected that last-click attribution was misrepresenting the effectiveness of their ATL campaigns by not crediting them sufficiently for driving subscriptions. Leveraging first- and third-party data on marketing, business, and exogenous factors, we built a marketing mix model to identify the contribution of total marketing and its channels to total subscribers and inform the optimal mix for planning. The models forecast that shifting the media mix could increase total subscribers by 6-10%. Acting on this insight, the client adjusted their media plan and achieved an 8% increase in total subscribers.
HISTORICAL MIX vs. OPTIMAL MIX
[Charts: channel mix across brand TV, national press, outdoor, inserts, catalogs, mobile, and other media.]
//
INNOVATION ROADMAP
WHY DO WE NEED IT? The “70/20/10 Approach” maps media tactics into a structured framework that drives KPIs. “The next new thing” is tested in a controlled environment, and only strong performers move into ongoing usage.
70% DRIVE EFFICIENCY
Evergreen media: proven media tactics that improve AL-generation efficiency. Ongoing optimizations within this 70% continue to improve the overall cost per AL.
20% LEARN MORE
Test and learn media: tactics that show signs of positive contribution to AL generation but need further iteration to improve or prove out efficiency and scalability. These tactics can also be efforts to further scale evergreen media.
10% BREAK NEW GROUND
The very edge of media: tactics that are entirely untested within Audible or even in the market. These tactics are kept on a very short leash and carefully monitored to avoid waste; they are either moved to the test-and-learn phase or eliminated until it makes sense to re-test.
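Applied to the suggested $50m budget, the split is straightforward arithmetic; the sketch below assumes the entire paid-media budget is governed by the 70/20/10 rule.

```python
# 70/20/10 split of the $50m media budget (assumes the whole budget follows the rule).
budget = 50_000_000
split = {
    "70% drive efficiency (evergreen)":   0.70,
    "20% learn more (test & learn)":      0.20,
    "10% break new ground (innovation)":  0.10,
}

for bucket, share in split.items():
    print(f"{bucket:38s} ${share * budget:,.0f}")
# -> $35,000,000 / $10,000,000 / $5,000,000
```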
//
70 / 20 / 10 APPROACH TO INNOVATION
[Diagram: scale & efficiency vs. innovation]
70% DRIVE EFFICIENCY (Evergreen: what we know works): drive acquisition; drive brand awareness & consideration; generate test & learn scenarios
20% LEARN MORE (Test & Learn: innovating off of what works): increase scale; test new partners & audiences; test new tactics
10% BREAK NEW GROUND (Innovation: brand new ideas): break new ground; test new strategies; first-to-market opportunities
//
70% EXAMPLES
DRIVE EFFICIENCY (Evergreen: what we know works)

DRIVE ACQUISITION
• Generate the bulk of ALs through consistent, well-measured partner performance
• Grow scale with ongoing optimizations of the best-performing targets and partners while allocating budget away from poor performers

DRIVE BRAND AWARENESS & CONSIDERATION
• Accelerate target audiences through the funnel with above-the-line tactics that educate audiences on Audible's product and value proposition
• Ex. tent-pole TV

GENERATE TEST & LEARN SCENARIOS
• Identify the strongest tactics that deserve to scale via test-and-learn campaigns in the 20% bucket
• Ex. finding new partners, similar programming, etc.
//
20% EXAMPLES
LEARN MORE (Test & Learn: innovating off of what works)

TEST NEW PARTNERS
• Test new partners within tactics that have proven to be effective
• Ex. A/B testing partners, similar to current performers, for a set period of time in order to determine viability

TEST NEW AUDIENCES
• Find new, untapped audiences through analysis of currently performing targets and demographics
• Ex. testing look-alike audiences, new targets through cohort analysis, new behavioral types, TV demo targets, etc.

TEST NEW TACTICS
• Test new tactics within existing strategies that have proven to be efficient
• Ex. podcast media/sponsorships, content creation, digital & traditional program sponsorships, audience retargeting, sequential messaging, etc.
//
10% EXAMPLES
BREAK NEW GROUND (Innovation: brand new ideas)

BREAK NEW GROUND
• Invest in opportunities that have not been tested or tried by Audible, or by other advertisers in the category
• Ex. develop an interactive OOH ad that lets users experience Audible within key markets

NEW STRATEGIES
• Test entirely new strategies, not derived from existing campaigns, that can generate new insights
• Ex. sponsor one or more episodes of a TV series

FIRST-TO-MARKET OPPORTUNITIES
• Trial new technologies, products, and opportunities (some with beta exclusivity)
• Ex. craft a campaign to participate in the beta of the new Facebook DSP, messaging, etc.
//
2H 2015 INNOVATION THOUGHT STARTERS

DIGITAL OOH & EXPERIENTIAL
• Simulate Audible moments for commuters in transit terminals.
• Sponsor marathons and other events where Audible can promote “flow.”

PODCAST CONTENT CREATION
• Bring the Audible experience to another platform that complements curiosity.
• Launch a podcast series that extends the reach of your content.

SOCIAL MEDIA CONTENT STRATEGY
• Continue to feed the curiosity of the digitally native Audible listener.
• Provide bite-sized content across existing and emerging social media platforms to keep them hungry.
//
TEST & LEARN
TESTING PROCESS
Select a hypothesis for testing
Choose the right metrics
Establish media buying rules
Analyze data / results
//
TEST & LEARN
TEST & LEARN APPROACH GUIDES THE ON-GOING CAMPAIGN
[Diagram: evergreen activity (70%) and test & learn + innovation activity (30%) run across online, mobile, and traditional channels. Both tracks plan and apply partners, tactics & targeting and are measured on trial volume, cost-per-trial, AL volume, cost-per-AL, click-to-AL %, trial-to-AL %, and engagement, plus organic lift in trials & ALs, lift in DR metrics (CVR / CTR), lift in trial-to-AL take rate, and lift in site visitation and/or GQV. Contextual, demographic, behavioral, day-part, and location insights flow between the tracks, with ongoing evaluation and iteration.]
//
DIGITAL EXAMPLE

PARAMETERS
• Targeting: run of network
• Pacing: even, week-to-week
• Flighting: 30 days
• Purchase model: eCPC or eCPM, static bid cap across all partners
• Budget: $6,250 per partner, per week
• Creative: generic, cross-segment, cross-partner

MEASUREMENTS
• Inventory quality: cost-per-trial & click-to-trial % as primary indicators
• Scale: trial volume as primary indicator
• Efficiency: comparative CPT (per category of partner)
• Misc. to note: capabilities, specializations
• Evaluation: week-over-week
• Approach: partners categorized by potential scale & efficiency (a scoring sketch follows below)
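A sketch of how the week-over-week partner comparison could be scored, using hypothetical click and trial counts; the median-based efficiency and scalability cut-offs are assumptions for illustration, not the deck's definition.

```python
from statistics import median

# Hypothetical 30-day results per partner at even $6,250/week spend (4 weeks = $25,000 each).
results = {  # partner: (clicks, trials)
    "Partner 1": (60_000, 900),
    "Partner 2": (25_000, 300),
    "Partner 3": (18_000, 450),
    "Partner 4": (70_000, 1_400),
    "Partner 5": (22_000, 250),
    "Partner 6": (55_000, 1_100),
}
spend = 6_250 * 4

cpt = {p: spend / trials for p, (_, trials) in results.items()}        # comparative cost per trial
ctt = {p: trials / clicks for p, (clicks, trials) in results.items()}  # click-to-trial %
cpt_cut = median(cpt.values())
volume_cut = median(trials for _, trials in results.values())

for p, (clicks, trials) in results.items():
    efficient = cpt[p] <= cpt_cut
    scalable = trials >= volume_cut
    print(f"{p}: CPT ${cpt[p]:.2f}, click-to-trial {ctt[p]:.1%}, "
          f"{'scalable' if scalable else 'not scalable'}, "
          f"{'efficient' if efficient else 'inefficient'}")
```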
[Diagram, July 2015: even pace and even spend across all partners. Each of six partners is evaluated on click-to-trial % and trial volume and assigned a comparative category (e.g. Partner 1: category 3; Partner 2: category 4; Partner 3: category 2; Partner 4: category 2; Partner 5: category 1; Partner 6: category 3).]
//
PARTNER TEST: COMPARATIVE CATEGORIZATION
[Quadrant chart plotting partners on efficiency vs. scalability: Q1 (scalable, efficient): Partner 4, Partner 6; Q2 (scalable, inefficient): Partner 1; Q3 (not scalable, inefficient): Partner 5; Q4 (not scalable, efficient): Partner 3; Partner 2 is also plotted.]
//
TESTING TRADITIONAL: CRITERIA & SELECTION
In each region (NE, SE, MW, SW, NW): 3 test cities and 3 control cities.
Regionally representative cities, comparable in: population, demographics, cost of media, brand perception.
//
TESTING TRADITIONAL: ABOVE-THE-LINE MEASUREMENT APPROACH
In each region (NE, SE, MW, SW, NW): 3 test cities and 3 control cities.
1. Pre-campaign benchmarks: brand survey, average click-through rate, average click-to-trial rate, average trial volume, site traffic
2. Above-the-line activity (e.g. local TV)
3. Post-campaign benchmarks: brand survey, average click-through rate, average click-to-trial rate, average trial volume, site traffic
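A minimal sketch of the pre/post, test-versus-control comparison described above, applied to one metric (weekly trial volume); the city names and benchmark figures are invented.

```python
# Pre/post benchmarks for one metric, weekly trial volume; all figures are invented.
test_cities = {"Test city A": (500, 640), "Test city B": (420, 530), "Test city C": (610, 760)}
control_cities = {"Control city D": (480, 500), "Control city E": (450, 460), "Control city F": (590, 620)}

def pct_change(cities):
    """Percent change from the summed pre-campaign benchmark to the post-campaign benchmark."""
    pre = sum(before for before, _ in cities.values())
    post = sum(after for _, after in cities.values())
    return (post - pre) / pre

test_lift = pct_change(test_cities)
control_lift = pct_change(control_cities)
print(f"test-city lift:      {test_lift:.1%}")
print(f"control-city lift:   {control_lift:.1%}")
print(f"lift attributable to ATL activity: {test_lift - control_lift:.1%}")
```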
//
WHAT WE NEED FROM AUDIBLE PRE-TEST CAMPAIGN BENCHMARKS
• Benchmark average click-through rates by city (paid social & paid search)
• Benchmark average click-to-trial rates by city (paid social & paid search)
• Benchmark average trial volumes by city
• Benchmark average organic site traffic by city
• Benchmark average organic search volume by city
//
CORRELATION MEASUREMENT INDIRECT MEASUREMENT OF TRADITIONAL & DIGITAL MEDIA IMPACT (ILLUSTRATION PURPOSES ONLY)
• Ongoing analysis of trends and correlations with media spend to further measure media impact beyond direct attribution
• Measuring the effectiveness of media spend which cannot be directly attributed to conversions
• Leveraging output to craft test & learn scenarios to prove/disprove correlation hypotheses
//
REPORTING
REPORTING A dashboard combines easy-to-use pivot tables with visual representations of performance from both topline and granular perspectives. Each view can be easily adjusted to provide the right level of detail, with the correct visual representation, for any given individual or situation. Detailed write-ups are delivered weekly, monthly, and quarterly, explaining in plain English why media is performing as it is, the optimizations made, the impact those optimizations have had, and the comprehensive rationale behind future planning based on those factors.
EXECUTIVE SUMMARY: topline view of spend and performance by channel. Use when you need a simple, quick view into the overall health of each channel.
PACING SUMMARY: spend pacing by channel and partner. Use when you need to see where budget reallocations are necessary across channels.
DAILY/WEEKLY NETWORK PERFORMANCE: detailed spend and performance metrics by channel & partner. Use when you need a detailed view into all available media & performance metrics.
//
EXECUTIVE SUMMARY
//
PACING SUMMARY
//
DAILY/WEEKLY/MONTHLY SUMMARY
//
CROSS-CHANNEL DASHBOARD
Regional Breakouts
Offer/Creative Breakouts
Flight Breakouts (Quarter, Month, Year, etc)
//
CROSS-CHANNEL REPORTING
//
TECH SOLUTIONS ECOSYSTEM By selecting the technology partners that fit Audible best, M&C will create a technology ecosystem that improves AL-generation efficiency by enhancing campaign management and increasing visibility.
[Diagram: data flows between campaign management platforms (ad servers, PMDs, in-app attribution SDKs) and analysis & understanding platforms (visualization tools, social listening, multi-touch platforms, analytics platforms, DMPs).]
//
TECHNOLOGY PHILOSOPHY There is no single technology ecosystem that fits all businesses. Every business is unique and deserves an ecosystem that consists of platforms individually selected to address its specific challenges and needs.
Marketplace Evaluation: Who are the major players? What are their main strengths, weaknesses, and differentiators?
Customized RFP: Work with Audible to create a set of evaluation criteria tailored to the needs and limitations of all relevant teams (BI, Marketing, CRM, etc.)
Final Selection: With a shortlist of possible candidates, evaluation meetings are conducted before selecting the platform that best meets Audible's specific criteria
//
AD SERVERS Ad servers store creative units and systematically deliver them to sites and apps; they also enable delivery reporting and conversion tracking. Many ad servers now have both desktop and mobile capabilities.
KEY BENEFITS
• Increased creative rotation control
• Unified digital reporting
• Conversion attribution for (m)web events
CHALLENGES
• Pricing can be prohibitive for high-impression plans
• Pixel implementation required for conversion attribution
• Often limited in-app conversion tracking
CRITERIA
• Number of partner integrations
• Mobile capability
• Server uptime
Possible Partners
//
PMDS Preferred Marketing Developers (PMDs) use API integrations with paid-social platforms (Facebook, Twitter, etc.) to create a unified interface for managing paid-social campaigns. Many PMDs also add capabilities not available on the native platforms.
KEY BENEFITS
• Automated optimization and bid rules
• Comprehensive paid-social reporting
• Added targeting options
CHALLENGES
• Rapidly changing marketplace requires frequent evaluations
• Reliance on one platform for all paid social
• Non-Facebook integrations often still nascent
CRITERIA
• Number of paid-social API integrations
• Number of third-party data partnerships
• Quality of support
Possible Partners
//
DMPS Data Management Platforms (DMPs) act as a centralized repository for an advertiser's data. After ingesting data, DMPs are able to create audience segments that can be sent to integrated media partners for targeting with digital ads. DMPs also house all 1st and 3rd party data for ad hoc analysis and modeling. Access to this first-party DMP data can be restricted.
KEY BENEFITS
• Allows for leveraging of first-party data
• Marries first-party and third-party data
• Enables highly targeted messaging
CHALLENGES
• Access to extensive Audible data required to be impactful
• Requires frequent refreshing of data
• Needs strict privacy controls
CRITERIA
• Quality of user interface
• Number of integrated partners
• Stability of platform
Possible Partners
//
IN-APP ATTRIBUTION SDKS In-app attribution SDKs are designed to track in-app conversion events and attribute them back to a mobile-media partner. Post-backs are sent to partners when they drive a conversion. These real-time data points allow media partners to maximize their optimization algorithms.
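Purely as an illustration of the postback mechanism, the sketch below shows what a server-side conversion postback might look like; the endpoint, field names, and event labels are hypothetical and do not represent any specific SDK vendor's API.

```python
import requests

# Hypothetical conversion postback to a media partner; endpoint and fields are made up.
def send_postback(partner_url: str, device_id: str, event: str, revenue: float = 0.0) -> int:
    payload = {
        "device_id": device_id,   # identifier the partner can match to its click or impression
        "event": event,           # e.g. "trial_start" or "al_conversion" (illustrative labels)
        "revenue": revenue,
        "app": "audible-ios",     # illustrative app identifier
    }
    response = requests.post(partner_url, json=payload, timeout=5)
    return response.status_code

# Example (not executed): notify a partner in near real time that its click led to a trial.
# send_postback("https://tracking.example-partner.com/postback", "IDFA-1234", "trial_start")
```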
KEY BENEFITS
• Provides last-click attribution of conversions
• Helps media partners optimize
• Allows for budget allocation based on cost per action
CHALLENGES
• Requires implementation of the SDK into the app
• No reliable impression-based tracking
• Routine QA is required to ensure proper postbacks to partners
CRITERIA
• Number of integrated media partners
• Reporting interface
• Ability to track multiple conversions
Possible Partners
//
MULTI-TOUCH PLATFORMS Multi-touch attribution platforms go beyond last-touch and last-click attribution. Instead, they track the consumer journey across all digital ad impressions that lead to a conversion. Models can then be built that assign percentages of a conversion to each touch point. Often these companies can give insights beyond digital as well.
KEY BENEFITS
• Allows for a fuller understanding of the path to conversion
• Optimizations no longer reliant on last click
• Visibility into cross-device attribution
CHALLENGES
• Attribution weighting depends on a subjective model
• Cross-device attribution is often probabilistic
• Impression tracking not always available
CRITERIA
• Ability to track cross-device
• Deterministic attribution capability
• Flexibility of model
Possible Partners
//
ANALYTICS PLATFORMS Analytics platforms can be deployed in-app and/or on-site to provide a fuller picture of what consumers are doing on the site or in the app. They go beyond the conversion events tracked by attribution SDKs and/or ad servers, tracking user behaviors and metrics such as time spent.
KEY BENEFITS
• In-depth user activity visibility
• Can be used for site / app optimizations
• Assists with identifying issues that cause app crashes
CHALLENGES
• Requires implementation of site code or SDK
• Does not send data back to media partners
• Often requires separate tools for in-app vs. on-site
CRITERIA
• Ease of implementation
• Reporting interface
• Ability to report in real time
Possible Partners
//
VISUALIZATION TOOLS Visualization tools create customizable graphical representations of data for both at-a-glance performance evaluations via dashboards and in-depth analysis via quickly generated graphs and charts. These tools would cover data from offline and online media.
KEY BENEFITS
• Simplifies complex data
• Dashboards allow for quick views of performance
• Can easily generate graphics for presentations
CHALLENGES
• Data privacy needs to be ensured
• Needs to plug into many data sources
• Has to adapt to many different KPIs
CRITERIA
• Flexibility for customization
• Ability to ingest volumes of data
• Collaboration capabilities
Possible Partners
//
SOCIAL LISTENING TOOLS Social listening tools use “spider” technologies to crawl social platforms, such as Facebook and Twitter, to measure references to a brand. Many of these tools also have algorithms that gauge the sentiment of social posts to report on how audiences feel about the brand.
KEY BENEFITS
• Understand impact on social media
• Receive real-time feedback on brand perception
• Increases social intelligence for better interaction with audiences
CHALLENGES
• Doesn't tie back to conversion events like an AL
• Requires social-management expertise to parse data
• Need to routinely evaluate tools to account for new social platforms
CRITERIA
• Quality of sentiment analysis
• Ability to archive posts
• Functionality to flag complaints
Possible Partners
//
MEASUREMENT
MEASUREMENT & OPTIMIZATION Every dollar spent on paid media should have a positive impact on the main business objective of driving efficient ALs. As such, the overall cost per AL will be constantly monitored to help determine the impact of ATL activity on the performance of BTL channels. However, not all media can be or should be judged against direct contributions to AL volume. Media should be evaluated in three cohorts:
• Non-digital ATL
• Digital ATL
• Digital BTL
Overall goal: $120 cost per AL.
Budget between the cohorts is allocated based on correlations between spend within each cohort and lifts in overall ALs. Ex: if increased investment in Digital ATL drives brand lift but no movement in AL volume, that spend will decrease.
//
ATL NON-DIGITAL KPIS Channels: TV, OOH, experiential Every dollar spent on paid media should have a positive impact on the main business objective of driving efficient ALs. As such, the overall cost per AL will be constantly monitored to help determine the impact of ATL activity on the performance of BTL channels.
RANKED KPIs (impact on / measured by)
1. Brand: awareness, consideration, GQV (surveys / studies, Google)
2. Social media: mentions, sentiment, shares (social listening tools)
3. Business objectives: overall AL and trial volume / cost per; overall average member conversion rate (1st-party Audible data)
4. Site / app metrics: take rate, visits, installs (analytics platforms / SDKs)
5. Digital campaigns: CTR, attributable CPAL, CPT, CPI (ad servers / SDKs)
//
ATL DIGITAL KPIS Channels: Desktop, mobile, podcast sponsorships, digital radio Budgets will be prioritized towards the ATL digital partners whose spend correlates best with lifts in brand social metrics. Being digital platforms, we can also see directly attributable impacts upon business objectives, such as ALs and trials, so those will act as a significant secondary metric.
RANKED KPIs (measured by)
1. Brand: awareness, consideration (surveys / studies, Google)
2. Social: mentions, sentiment, shares (social listening tools)
3. Trial: volume / cost per (server / SDK)
4. AL: volume / cost per (REF tag, source code)
5. Member conversion rate (server / SDK & REF tag)
6. Site / app metrics: take rate, visits, installs (analytics platforms / SDKs)
//
BTL DIGITAL KPIS Channels: Desktop and mobile. As trial data can be passed to partners in real time, trials are the primary BTL digital KPI. ALs and member conversion rate are strong secondary KPIs, so partners who generate non-converting trialists are removed. BTL media is expected to have some minimal impact on brand and social, but those metrics are not treated as primary KPIs.
RANKED KPIs (measured by)
1. Trial: volume / cost per (server / SDK)
2. AL: volume / cost per (REF tag, source code)
3. Member conversion rate (server / SDK & REF tag)
4. Site / app metrics: take rate, visits, installs (analytics platforms / SDKs)
5. Social: mentions, sentiment, shares (social listening tools)
6. Brand: awareness, consideration (surveys / studies, Google)
//
FIRST-PARTY DATA INFORMS MEDIA BUYING
DEMOGRAPHIC
• Data: AdvertiserID, DeviceID, CampaignID, Location/Geo
• Media insight: identify best performing combinations of age, gender, HHI, lifestyle, education, etc.
• Action example: females between the ages of 25-34 with a college education and $75k+ HHI have a higher propensity to convert to trial on blog and long-tail video inventory.

BEHAVIORAL
• Data: AdvertiserID, CampaignID, DeviceID, Cookie Data (Path-to-CVR), 3rd Party Data
• Media insight: identify typical user behaviors / paths between ad engagement and conversion.
• Action example: users who have been exposed to above-the-line tactics, such as videos and high-level sponsorships, have a higher propensity to engage with a direct response ad and ultimately convert to trial.

CONTEXTUAL
• Data: AdvertiserID, CampaignID, DeviceID, CreativeID, Cookie Data, Source Code
• Media insight: identify the types of media, partners, and verticals to scale, and where to scale back.
• Action example: audiences exposed to ads for sci-fi audiobooks within science, education, or gaming content verticals are more likely to convert to trial.

GEO / LOCATION
• Data: AdvertiserID, CampaignID, DeviceID, Location Data, Third Party Data
• Media insight: identify best converting and scalable markets.
• Action example: audiences in major commuter cities (such as Los Angeles, New York, and Chicago) convert best off of ads for self-improvement and finance books within podcast content.

DAYPART
• Data: AdvertiserID, CampaignID, Timestamp, Location Data
• Media insight: identify the best performing times of day / days of week based on CVR & volume (a query sketch follows below).
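As a sketch of how the geo and daypart cuts above could be produced from a first-party log, the code below assumes hypothetical column names (timestamp, dma, clicks, trials) in a campaign log export.

```python
import pandas as pd

# Hypothetical first-party click/conversion log; column names are illustrative.
log = pd.read_csv("campaign_log.csv", parse_dates=["timestamp"])
log["hour"] = log["timestamp"].dt.hour
log["weekday"] = log["timestamp"].dt.day_name()

# Conversion rate (trials per click) by daypart and by market.
daypart = (log.groupby(["weekday", "hour"])[["clicks", "trials"]].sum()
              .assign(cvr=lambda d: d["trials"] / d["clicks"])
              .sort_values("cvr", ascending=False))
markets = (log.groupby("dma")[["clicks", "trials"]].sum()
              .assign(cvr=lambda d: d["trials"] / d["clicks"])
              .sort_values(["trials", "cvr"], ascending=False))

print(daypart.head(10))   # best-converting days of week / hours of day
print(markets.head(10))   # highest-volume, best-converting markets
```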
//
FORECASTING
IMMEDIATE
Build a media mix and plan based on the $120 cost per AL goal, using assumptions grounded in Audible 1st-party data and 3rd-party research. Key inputs
Overall ALs, average click-to-trial rates, trial volume by month, site visitation volume, current measured digital cost-pers, field research results, audience sizing data.
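As a worked example of the immediate-term sizing, the sketch below uses the $50m budget and the $120 cost-per-AL goal from this deck; the trial-to-AL rate is a placeholder assumption to be replaced with Audible 1st-party data.

```python
# Immediate-term sizing against the $120 cost-per-AL goal.
budget = 50_000_000
cost_per_al_goal = 120

target_als = budget / cost_per_al_goal        # ~416,667, in line with the 417K AL figure
trial_to_al_rate = 0.40                       # placeholder assumption; replace with Audible 1st-party data
required_trials = target_als / trial_to_al_rate

print(f"target ALs:      {target_als:,.0f}")
print(f"required trials: {required_trials:,.0f} at an assumed {trial_to_al_rate:.0%} trial-to-AL rate")
```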
SHORT-TERM
Design tightly restricted exposed / unexposed tests of ATL spend. Correlations between that spend and lifts in overall cost-per AL will be used to forecast changes in efficiency based on re-distribution of budget. Key inputs
Results from media tests, overall ALs, trial volume by week, and site visitation volume, BTL media results (e.g. CPT).
LONG-TERM
Cross-channel econometric model. Key inputs
Two years of the following historical data points: spend, activity level, creative, markets, and campaign names across all channels; results from all campaigns; overall AL, trial, and site visitation volume.
//
625 Broadway, 6th Floor New York, New York 10012
THANK YOU!