Tuesday, December 15, 2009

Why do marketers who engage external expertise gain more from testing?

Recent research by Forrester shows that marketers who engage external expertise to plan and execute multivariate tests benefit from higher levels of uplift. Our own experience has also shown that marketers who engage a full service model tend to see more uplift than those who do not; the difference can be up to threefold.

"…organizations that leverage a managed service model achieve greater gains. On average these gains are two percentage points higher, which could equate to millions of dollars for conversion-based Web sites."
The Online Testing Vendor Landscape, Forrester Inc. October 2009
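To put the Forrester quote in context, a quick back-of-the-envelope calculation shows how two extra percentage points of conversion can reach the millions. The traffic and order-value figures below are purely hypothetical, chosen for illustration; they are not taken from the Forrester report or any client data.

```python
# Hypothetical figures, for illustration only.
annual_visitors = 5_000_000       # visitors per year to a conversion-based site
average_order_value = 100.00      # dollars per converted visitor
extra_conversion = 0.02           # two additional percentage points of conversion

# Extra revenue from converting two more visitors out of every hundred.
extra_revenue = annual_visitors * extra_conversion * average_order_value
print(f"${extra_revenue:,.0f}")   # $10,000,000
```

Even at a fraction of this traffic, the gap between managed and self-managed testing compounds quickly.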

Is it simply the case that marketers don’t get as much testing done when they don’t have someone external to proactively drive the project forward?

We looked into it, but found no significant difference in the actual number of tests completed, so the difference must lie in how the tests are run. To explore this in more detail, and to see whether we could pin down why self-managed testing delivers less than engaging external expertise, we ran an experiment of our own.

Earlier this month, we invited marketers to take part in an experiment. We showed the before and after pages from a multivariate test where the differences were fairly subtle. Marketers were asked to guess which bracket they thought the uplift in conversion rate would sit within.

Only 9% of our sample got it right. It's not hugely scientific, but it is interesting that the remaining 91% of marketers underestimated the impact this test would have on the bottom line: 31% suggested the uplift would be in the region of 0-10%, when the actual uplift in clicks to the booking page was 30%.
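For readers unfamiliar with how that 30% figure is derived, uplift here means the relative improvement in conversion rate, not the absolute change in percentage points. The sketch below uses invented before/after click counts (not the actual test data) to show how a modest absolute shift produces a 30% relative uplift.

```python
# Illustrative only: hypothetical before/after figures, not the actual test data.

def relative_uplift(baseline_rate, variant_rate):
    """Relative (percentage) uplift of the variant over the baseline."""
    return (variant_rate - baseline_rate) / baseline_rate * 100

visitors = 10_000        # visitors shown each page version
baseline_clicks = 500    # 5.0% clicked through to the booking page
variant_clicks = 650     # 6.5% clicked through after the subtle changes

baseline_rate = baseline_clicks / visitors
variant_rate = variant_clicks / visitors

# An absolute gain of 1.5 percentage points is a 30% relative uplift.
print(f"{relative_uplift(baseline_rate, variant_rate):.0f}% uplift")
```

This distinction is one reason subtle changes are so easy to underrate: a shift that looks small in absolute terms can be a large relative gain.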

To be effective, a multivariate testing campaign needs to test the right hypotheses in the right order; knowing what to test, and when, is key to achieving results and answering questions fast. 91% of marketers underestimated the importance of the position of the price and call-to-action buttons on this checkout page. Identifying the subtle changes that will have a large impact on critical pages in the funnel is vital to achieving good results.

We’d suggest therefore that much of the value of engaging external expertise in planning and executing tests arises from being able to take a strategic view of the testing process, identifying the most effective places to test and then testing the changes most likely to make a difference.

Marketers who overlook the process of determining which tests are most effective do so at their peril. By underestimating the impact of apparently subtle changes, a significant amount of benefit is lost. Multivariate testing entered into on an ad hoc basis simply does not generate the uplift that can be attained by knowing what to test, where and when, based on a wealth of expertise and past experience.

Tuesday, December 08, 2009

Can You Spot the (Profit) Difference?

In March we challenged you to guess the winning variant from a multivariate test. Over 500 marketers took part and we were surprised that only 4.6% got it right!

We’ve launched another challenge with a new twist, and we’d like to invite you to have a go and test a few more preconceptions. We will share some anonymous results afterwards so you can see how you did against your peers.

What impact do you think these very subtle changes had on our client’s conversion rate?

Play Spot the (Profit) Difference

This is designed to help you think about your web content and the ways it can be managed to proactively improve conversion metrics.

Good luck!