
Friday, May 04, 2007

When A/B and Split Tests aren’t enough – go Multivariate

A/B and split testing are great methods for working out which of two variations of some content will generate more sales, registrations or some other key metric. Anyone who has optimised a Google AdWords campaign will be familiar with the concept: you serve two content options to your visitors and find out which one attracts the right leads or makes people more likely to buy.

Maxymiser offers the ability to perform A/B testing on web page content; it's the entry level to our technology. However, to really learn what the optimal content for your site is, you need to test more than one area of the homepage.

Imagine we wanted to optimise enquiries from the Maxymiser website. We could perform a test to assess the impact that four different banner images have on quote requests, and also the effect of replacing the 'Segmentation' text with a diagram. This gives us an A/B/C/D test and an A/B test, as shown in the illustration.

Running these two tests at the same time using a feature available in some CMS solutions would give invalid results, because each test would measure the response based on only one of the variables. People who saw Banner A might convert better than those who saw Banner B, but without data on which variant of the A/B test lower down the page each of them saw, no valid conclusion can be drawn. People don't make decisions based on just one variable; both areas contribute, so testing multiple areas in this way is flawed.

You could run the tests one after another, but again you could miss an uplift generated by a particular combination of content. For example, you might test the four banner variants first and find that C gives the greatest uplift, at 34%. The next month this is implemented and the lower area is tested, with variant B winning at 15% uplift. This approach doesn't tell you how people would have responded to variant B combined with the other three banner variants. That is a critical flaw: in our experience it is never the winners of the individual areas that combine to give the best overall page, so implementing each area's winner does not deliver the greatest conversion uplift.
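The interaction effect described above can be shown with a small sketch. All conversion rates below are invented purely for illustration; the banner and lower-area labels follow the example in the post:

```python
# Hypothetical conversion rates for each (banner, lower-area) combination.
# All figures are invented to illustrate the interaction effect.
rates = {
    ("A", "Text"): 0.020, ("A", "Diagram"): 0.022,
    ("B", "Text"): 0.021, ("B", "Diagram"): 0.019,
    ("C", "Text"): 0.027, ("C", "Diagram"): 0.024,  # banner C wins when tested alone...
    ("D", "Text"): 0.018, ("D", "Diagram"): 0.030,  # ...but D + Diagram is the true optimum
}

# Sequential testing: pick the winning banner first (averaging over the
# lower area), then pick the winning lower-area variant with that banner fixed.
banner_avg = {b: (rates[(b, "Text")] + rates[(b, "Diagram")]) / 2 for b in "ABCD"}
best_banner = max(banner_avg, key=banner_avg.get)
best_lower = max(["Text", "Diagram"], key=lambda v: rates[(best_banner, v)])
sequential_rate = rates[(best_banner, best_lower)]

# Multivariate testing: measure every combination directly.
best_combo = max(rates, key=rates.get)

print(best_banner, best_lower, sequential_rate)  # the sequential "winner"
print(best_combo, rates[best_combo])             # the true best combination
```

With these numbers, testing the areas one after another settles on banner C with the original text, while the combination that actually converts best (banner D with the diagram) is never even measured.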

Another option would be to combine the two tests into one. This gives an A/B/C/D/E/F/G/H test (2 x 4 variants). Adopting this approach does give results that reflect every combination a visitor could see, but it is time-consuming and complex to code if you are not using a tool that can simply alter front-end content without rewriting pages.
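Enumerating the combined test's variants is a matter of taking the cross product of the two areas. A minimal sketch, using the variant labels from the example above:

```python
from itertools import product

# The two test areas from the example: four banners and two lower-area variants.
banners = ["Banner A", "Banner B", "Banner C", "Banner D"]
lower_area = ["Segmentation text", "Diagram"]

# The combined test must cover every pairing: 4 x 2 = 8 variants (A through H).
combinations = list(product(banners, lower_area))

for label, combo in zip("ABCDEFGH", combinations):
    print(label, combo)
```

Each visitor would be assigned one of these eight page versions, which is exactly what makes the combined test complete but also tedious to hand-code as the number of areas grows.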

A more effective option is to run this as a multivariate test, using technology that is simple to implement. This minimises the technical resource needed for setup and delivers reports on which combination of content, as well as which individual element, provides the best uplift in quote requests. By working with a company such as Maxymiser, you don't need to worry about sample sizes, response tracking or test setup: our technology handles these and gives you clear reports at the end of the process, so that you can report the uplift within your organisation and apply the results.

With Continuous Optimisation, you can even allow our tools to monitor the test 24/7: once it becomes clear that one variant is outperforming all the others, that variant is shown with greater frequency. In this way the site optimises itself without any human intervention, limiting the downside of testing by pushing more visitors to optimal variants as they emerge. On the technical front, Maxymiser requires very little resource to implement: your web page simply needs a few JavaScript tags inserted, which allow our server to place content into the defined 'Maxybox' test areas and track visitor response. It really doesn't take much more than half an hour of a techie's time!

We will be performing some live multivariate testing on our own homepage later in the year to demonstrate how our technology works; in the meantime, take a look at one of our many clients while we test them. Stay tuned for more.
