All This Conversion Data Is Great - But What’s It Really Worth?

View-through conversions are one of the trickiest metrics in digital marketing. Measuring the impact of activity is one of the unique selling points of biddable media, and with all that additional conversion data sitting there it’s tempting to assume some of it is valuable. But how much?

Google defines view-through conversions as:

The conversions where a customer saw – but didn’t click – an ad before completing a conversion.

At face value it’s a simple metric, but objections quickly become apparent. Customers have hundreds, sometimes thousands, of interactions with ads every day, most of which lead to no related action. There’s a reason the click is central to measuring user engagement and conversion: who’s to say that simply seeing an impression of your ad had any impact on a customer’s final decision to buy? At the same time, it’s unreasonable to assume that your ad had no influence on anyone who saw it, didn’t click, and went on to buy from you. So what’s really going on?

We came up with a two-step test to understand the actual impact of view-through.

Putting It To The Test(s)

In collaboration with Google, we scoped out two tests for one of our key retail clients, with the goal of understanding the impact of YouTube activity on incremental view-through conversions (VTCs) for add-to-basket events and purchases.

Test 1 (in collaboration with Google): a conversion uplift test

Users were shown either an ad from the retailer, or a ‘ghost’ ad for an alternative product/service. This was an impression-based test that measured the total overall conversion uplift for seeing an ad vs. not seeing an ad.

Test 2 (Katté test): a VTC incrementality test

A control group was shown the retailer’s usual YouTube activity, while a test group was shown YouTube ads for beach holidays that landed on a page recommending the best beach destinations in the world. We applied cross-negative video viewer audiences between campaigns so that control and test viewers could not contaminate each other.

Our test ads – Top 10 Beach Resorts from ‘Vue Thru’

We ran these tests for several weeks to ensure we captured the full purchase cycle for the client.

Getting to the number

In order to continue driving performance with the available budget, we did not use a 50/50 spend split during the test period. This presented a challenge when normalising the data: how would the test group’s conversion rate (CVR) have differed if its impression volume had matched the control group’s? To correct for this:

  1. We looked at historical instances of when overall YouTube activity had been at a similar impression volume to the volume reached by the test group.
  2. We calculated the difference in CVR between historical performance data and test group CVR.
  3. We applied this difference to the normalised test group impression volume to arrive at a more realistic CVR and conversion number.

Scaling up the test results using this method delivered a more accurate comparison than assuming conversions would scale proportionally to impression volume.
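One possible reading of the three normalisation steps above can be sketched in code. Everything here, from the function name to the figures, is our own hypothetical illustration, not data from the actual test:

```python
# A rough sketch of the normalisation method described above. All names
# and numbers are hypothetical; the article does not publish the
# underlying data.

def normalised_conversions(test_cvr, historical_cvr_at_test_volume,
                           historical_cvr_at_control_volume,
                           control_impressions):
    """Estimate test-group conversions at the control group's volume."""
    # Step 2: CVR difference between the test group and historical
    # periods that ran at a similar impression volume.
    cvr_delta = test_cvr - historical_cvr_at_test_volume
    # Step 3: apply that difference at the control-level (normalised)
    # impression volume, rather than scaling conversions proportionally
    # to impressions.
    adjusted_cvr = historical_cvr_at_control_volume + cvr_delta
    return adjusted_cvr, adjusted_cvr * control_impressions

adjusted_cvr, conversions = normalised_conversions(
    test_cvr=0.012,                          # hypothetical
    historical_cvr_at_test_volume=0.010,     # hypothetical
    historical_cvr_at_control_volume=0.009,  # hypothetical: CVR often dips at scale
    control_impressions=1_000_000,           # hypothetical
)
print(f"Adjusted CVR: {adjusted_cvr:.4f}, conversions: {conversions:.0f}")
```

The key point the sketch captures is that CVR is not held constant as impression volume grows, which is why a simple proportional scale-up would overstate conversions.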

Applying our findings

Following our reporting methodology, we arrived at final incremental add-to-basket and purchase uplifts for our client, weighting the Google test output against our own test output to produce deliberately conservative figures. This gives us real confidence that the calculation reflects real-world results:
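As a rough illustration of that weighting step (the actual weights and uplift figures are not published here), blending the two test outputs might look like this, with every number below hypothetical:

```python
# Hypothetical sketch of blending the two tests' uplift estimates.
# Weights and uplift figures are illustrative only.

def blended_uplift(uplift_google, uplift_katte, weight_google=0.4):
    """Weighted average of two uplift estimates. Placing more weight on
    the lower estimate keeps the blended figure conservative."""
    return weight_google * uplift_google + (1 - weight_google) * uplift_katte

# Hypothetical scenario: the Google conversion uplift test reads higher
# than the VTC incrementality test, so we under-weight it.
result = blended_uplift(0.08, 0.05, weight_google=0.4)
print(f"Blended incremental uplift: {result:.3f}")
```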

Add To Basket VTC Incrementality


Purchase VTC Incrementality

These numbers have made an appreciable difference to how we are developing our YouTube strategy with our client going into Q2 2021 and beyond, and have put a defined value on YouTube audiences and creatives that would otherwise have remained unidentified.

Working with what you have

Getting to figures we and the client were happy with was a challenge. With the tools and testing methods available, it took some deep thinking on both sides to understand the implications of our testing choices and how best to apply the outcomes to reporting. But arriving at these results has made a real difference to our confidence in scaling activity effectively, something we look to do for all our clients!