How to Gain New Insights Through A/B Testing

3 min read
Written by
Media.Monks

Everyone remembers the classic school experiment in which two seeds are planted in different conditions: one kept in a dark cupboard and the other in the light. Everything is kept equal except one variable. This is a very simple example of A/B testing, which this article will explain in greater detail.

While it makes far greater use of data and analytics than the simple (yet effective) example above, A/B testing can be used to measure the impact of a range of different activities, such as email marketing, PPC campaigns, website content and more.

But here, we’re going to discuss a very simple scenario. Let’s say we want to run a regional marketing test, and therefore wish to measure its impact to inform whether it’s worth running again in the future or not. But what exactly is A/B testing, and how can it help in this case? How should we go about running the test itself? And how do we actually measure its impact?

What is A/B testing?

A/B testing, also known as split testing or test and control, is used to compare two versions of something and measure the impact of a single changed element. In our regional marketing test example, we want to measure the uplift (if any) in the regions where the marketing ran, compared with the regions where it didn't.

How do you run an A/B test?

Start with the hypothesis to be tested. For example, you believe a certain piece of marketing will outperform the norm. This could be prompted by a question, such as “Why isn’t a particular channel working?” or “We’d like to explore this strategy in the future, how can we test if it works?” The latter fits well with our regional marketing example.

From here, make sure you have the data required to measure the test and set it up in the correct way. For example, if running a regional test, ensure you are able to collect regional/store level data. And be sure to discuss the test with the relevant parties to ensure that the spend, flighting and regions chosen for the test are sufficient. 

At any one time, there will be a whole range of factors that influence sales. However, by keeping everything the same except for the marketing being tested, we can confidently attribute any uplift to the test itself.

How do you measure A/B test results?

First, collect all the required data. In our regional test example, split the sales data into two distinct sets: a test set and a control set. Collecting store-level sales data tells us how to make this split.
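As a rough illustration of this split, the sketch below separates store-level sales records into test and control sets based on region. The store IDs, regions and figures are all hypothetical, not from any real campaign.

```python
# Hypothetical sketch: splitting weekly store-level sales records into
# test and control sets, based on the regions where the marketing ran.
sales = [
    {"store": "S1", "region": "North", "week": 1, "sales": 1200},
    {"store": "S2", "region": "South", "week": 1, "sales": 950},
    {"store": "S3", "region": "North", "week": 1, "sales": 1100},
    {"store": "S4", "region": "East",  "week": 1, "sales": 1010},
]

test_regions = {"North"}  # regions where the marketing activity ran

test_set = [row for row in sales if row["region"] in test_regions]
control_set = [row for row in sales if row["region"] not in test_regions]

print(len(test_set), len(control_set))  # 2 2
```

In practice this would run over many weeks of data and far more stores, but the principle is the same: every store falls cleanly into one set or the other.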

Next, we must index the test and control sets to the same baseline so that a fair comparison can be made. Since the two sets may have very different absolute values, indexing is the fairest way to compare their change in sales.

As mentioned previously, by keeping all other factors the same (e.g. product availability, no differences in offering between stores), any difference seen in the test set can be attributed to the marketing activity we ran. See the chart below for a visual example of this.
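Numerically, the attributable uplift is simply the gap between the indexed test and control series each week. Continuing with the hypothetical indexed figures from above:

```python
# Hypothetical sketch: the per-week uplift attributable to the marketing
# activity is the gap between the indexed test and control series.
indexed_test = [100.0, 108.0, 118.0]     # illustrative indexed test sales
indexed_control = [100.0, 102.0, 103.0]  # illustrative indexed control sales

uplift = [round(t - c, 1) for t, c in zip(indexed_test, indexed_control)]
print(uplift)  # [0.0, 6.0, 15.0]
```

A persistent positive gap over the test period is the signal that the marketing worked; a gap hovering around zero suggests it didn't.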

What A/B testing achieves

A/B testing allows us to calculate ROI more accurately than we otherwise could, for example by modelling total national sales with traditional marketing mix modelling (MMM). Store-level modelling is another option, but for smaller ad-hoc tests it is often not cost-efficient, given the resource required relative to the spend behind the test.
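A back-of-envelope ROI calculation from a test might look like the sketch below. The incremental sales figure would come from applying the control set's growth to the test set's baseline and taking the difference; every number here is illustrative, not a real result.

```python
# Hypothetical sketch: a simple ROI calculation from an A/B test result.
incremental_sales = 45000.0  # extra revenue attributed to the test (illustrative)
profit_margin = 0.30         # assumed margin on those incremental sales
media_spend = 10000.0        # cost of the regional marketing activity

# ROI = (incremental profit - spend) / spend
roi = (incremental_sales * profit_margin - media_spend) / media_spend
print(round(roi, 2))  # 0.35
```

Because the control set gives a clean counterfactual, this estimate is grounded in observed behavior rather than modelled assumptions alone.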

A/B testing doesn't just provide data-driven proof that a test has or hasn't worked. It also provides the opportunity to apply insights about consumer behavior to other areas of your marketing. If a regional test worked when marketing a particular product, it may well work again through a different channel.

Carrying out regular testing works best, as it helps you reach the optimal marketing spend and laydown within each channel more quickly, making for more fruitful marketing. And of course, much like our favorite seed experiment, shedding more light on marketing will only help its impact grow! Learn more about how we can help you now.

