Running your first A/B test

Acquiring new customers is expensive, so it shouldn’t be left to guesswork. A/B testing helps you get more out of your existing traffic by using data to identify which changes to your website help your results and which hurt them.

Every website has a goal. Online shops want visitors to buy products, software providers want people to sign up to their service, blogs want people to subscribe, and so on. Visitors who complete these goals are your conversions. By testing a different version of your web page, you can measure how many more (or fewer) of your visitors convert based on the changes you make.
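To make that concrete, here is a small illustrative calculation of conversion rate and relative uplift. The visitor and conversion counts are invented purely for the example.

```python
# Hypothetical numbers purely for illustration.
visitors_a, conversions_a = 5000, 100   # original page
visitors_b, conversions_b = 5000, 124   # changed page

rate_a = conversions_a / visitors_a     # conversion rate of the original
rate_b = conversions_b / visitors_b     # conversion rate of the variation

relative_uplift = (rate_b - rate_a) / rate_a
print(f"A: {rate_a:.2%}, B: {rate_b:.2%}, uplift: {relative_uplift:.1%}")
# A: 2.00%, B: 2.48%, uplift: 24.0%
```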

I will show you how to run your first A/B test and get actionable data that increases your conversions.


What do you want to test?

You can test anything on your website, but that doesn’t mean you should. Rather than diving straight in and making ad-hoc changes, you will get better results by applying a methodical approach.

A good framework to follow:

1. Ask yourself a question about something you don’t like about how your website performs e.g. Why don’t I get many sales of my 2nd tier product? Even better, if you have an active customer/visitor base, ask them what they would like to see changed.

2. Next, create a hypothesis based on this information, e.g. adding a ‘Most popular’ badge to my 2nd tier product in the product table will draw more sales.

3. Once you know what you want to test, decide how long the test should run. There is a handy calculator at convert.com that uses your current conversion rate, the level of improvement you want to detect, the number of variations you are running and your daily visitor count. A rough sketch of the underlying calculation follows this list.

4. Test your hypothesis by running the A/B test for the planned length of time.

5. Once the A/B test is complete, analyse your results. Your A/B testing software (more on those later) will tell you which is the winning version. If it is your new page, make that your new default!
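As a rough sketch of the sample-size arithmetic behind calculators like the one at convert.com (not their actual implementation), the snippet below uses the standard two-proportion formula with the usual 5% significance level and 80% power. The baseline rate, target uplift and traffic figures are hypothetical.

```python
import math

def sample_size_per_variation(baseline_rate, min_relative_uplift,
                              z_alpha=1.96, z_power=0.84):
    """Rough sample size per variation for a two-sided two-proportion test.

    baseline_rate       -- current conversion rate, e.g. 0.02 for 2%
    min_relative_uplift -- smallest relative lift you want to detect, e.g. 0.20
    z_alpha, z_power    -- defaults correspond to 5% significance and 80% power
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_relative_uplift)
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

n = sample_size_per_variation(0.02, 0.20)
print(n)                       # visitors needed in each variation
print(math.ceil(2 * n / 500))  # days to run at 500 visitors/day split across A and B
```

The key takeaway is that small baseline rates and small uplifts demand a lot of traffic, which is why deciding the test length up front matters.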

Common A/B testing mistakes to avoid

A/B testing sounds simple: create two versions of a page and see which performs better. However, there is a lot more to it than that, especially if you don’t want to waste time testing changes that have next to no impact. Here are common A/B testing pitfalls to avoid…

Testing too many things at once

If you make too many changes to a page at once, you won’t know which are having a positive impact and which are having a negative one. Multivariate testing lets you test a few ideas at once, but it needs a lot of traffic to be statistically valid.

For A/B testing, make one change and test the impact on your conversion before moving on to your next test.

Thinking too small

Micro-changes such as amending some copy here, or adding a link there, won’t have a significant impact on the performance of your web page. You need to make big changes to see big results.

For example, compare a short-form version of your page with a long-form version, or compare two completely different header images (e.g. one with a person and another of the product).

Once you have exhausted the big ideas, then you can start to drill down to finer details to optimise even further.

Focusing just on clicks

The goal of every test should be to increase conversions, not just clicks on a button or link from one page to another. Clicks on links matter, but only in the context of how they contribute to an uplift in conversions.

Every test should be judged against how many conversions it generated.
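Here is a quick illustrative comparison, with made-up numbers, of why judging by clicks alone can mislead.

```python
# Hypothetical funnel numbers: B gets more button clicks but fewer sales.
variations = {
    "A": {"visitors": 4000, "clicks": 600, "conversions": 96},
    "B": {"visitors": 4000, "clicks": 760, "conversions": 80},
}

for name, v in variations.items():
    ctr = v["clicks"] / v["visitors"]        # click-through rate
    cvr = v["conversions"] / v["visitors"]   # conversion rate
    print(f"{name}: click-through {ctr:.1%}, conversion {cvr:.1%}")

# B wins on clicks (19.0% vs 15.0%) but loses on conversions (2.0% vs 2.4%),
# so judged on the goal that matters, A is the better page.
```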

Overruling your results

It is common to get results that surprise you or go against everything you thought was true.

When this happens, it is easy to dismiss the results and continue with the old design. If you are unsure of the data, at a minimum run the test again. If the result holds a second time, trust the data.
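If you want a quick sanity check of your own before re-running, a two-proportion z-test is one way to gauge how likely the difference is to be noise. This is only a sketch with hypothetical counts; your A/B testing software performs an equivalent calculation for you.

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

# Hypothetical counts from a surprising test result.
p = two_proportion_p_value(conv_a=120, n_a=6000, conv_b=155, n_b=6000)
print(f"p-value: {p:.3f}")  # below ~0.05 suggests the difference is unlikely to be noise
```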

A/B testing case studies to inspire you

Having the framework to run an A/B test is one thing; knowing what to test is another. Here are some classic A/B test case studies to help you get some ideas.

A/B testing software

I have hands-on experience with Visual Website Optimizer and Optimizely. Of the two, I found Optimizely easier to set up and to analyse results with. Google offers a basic A/B testing tool called ‘Experiments’ through Google Analytics, but I haven’t used it, so I can’t honestly comment on its qualities. Seeing as it is free, I would recommend having a look if you are on a tight budget.

Most providers offer a free trial period, so I would recommend shopping around to find the one that suits you and your budget.

