
Tuesday, December 15, 2009

Why do marketers who engage external expertise gain more from testing?

Recent research by Forrester shows that marketers who engage external expertise to plan and execute multivariate tests benefit from higher levels of uplift. Our own experience has also shown that marketers who engage a full-service model tend to receive more uplift than those who do not. The difference can be up to threefold.

"…organizations that leverage a managed service model achieve greater gains. On average these gains are two percentage points higher, which could equate to millions of dollars for conversion-based Web sites."
The Online Testing Vendor Landscape, Forrester Inc. October 2009

Is it simply the case that marketers don’t get as much testing done when they don’t have someone external to proactively drive the project forward?

We looked into it, but found no significant difference in the actual number of tests completed, so it must be something about how the tests are run that makes the difference. To explore this in more detail, and to see whether we could pin down a reason for the gap between self-managed testing and testing run with external expertise, we ran an experiment of our own.

Earlier this month, we invited marketers to take part in an experiment. We showed them the before and after pages from a multivariate test where the differences were fairly subtle, and asked them to guess which bracket the uplift in conversion rate would fall within.

Only 9% of our sample got it right. It's not hugely scientific, but what is interesting is that the remaining 91% of marketers underestimated the impact this test would have on the bottom line: 31% suggested the uplift would be in the region of 0-10%, when the actual uplift in clicks through to the booking page was 30%.
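For readers unfamiliar with how these figures are derived, uplift here means the relative improvement of the variant over the baseline. A minimal sketch (the conversion rates below are illustrative, not the actual figures from the test described):

```python
def uplift(baseline_rate, variant_rate):
    """Relative uplift of a variant over the baseline, as a percentage."""
    return (variant_rate - baseline_rate) / baseline_rate * 100

# Illustrative numbers: a page whose click-through rate to the booking
# page rises from 10% to 13% shows a 30% relative uplift.
print(round(uplift(0.10, 0.13), 1))  # prints 30.0
```

The same percentage uplift translates into very different absolute gains depending on traffic, which is why a 30% lift on a checkout page can be worth millions.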

To be effective, a multivariate testing campaign needs to test the right hypotheses, and in the right order. Knowing what to test and in what sequence is key to achieving results and answering questions fast. 91% of marketers underestimated the importance of the position of the price and call-to-action buttons on this checkout page. Identifying the subtle changes that will have a large impact on critical pages in the funnel is vital to achieving good results.

We’d suggest therefore that much of the value of engaging external expertise in planning and executing tests arises from being able to take a strategic view of the testing process, identifying the most effective places to test and then testing the changes most likely to make a difference.

Marketers who overlook the process of determining which tests are most effective do so at their peril. By underestimating the impact of apparently subtle changes, they forgo a significant amount of benefit. Multivariate testing entered into on an ad hoc basis simply does not generate the uplift that can be attained by knowing what to test, where and when, based on a wealth of expertise and past experience.


Chris Goward said...

Thanks for this, Eric! We'll try this method out on the next multi-goal test.

Chris Goward said...

That research isn't surprising, Mark. In our experience, execution of A/B/n and multivariate testing is just as important as the consulting component.

When we began WiderFunnel, we would use the client's internal design and copywriting resources, and it rarely turned out well. There's a diverse set of specialized resources that is rarely found within a client company.

For example, WiderFunnel spends a lot of time training our designers and copywriters by looking at thousands of test results to learn what actually lifts conversions.

Few designers are trained for conversion optimization like this. In fact, most trained designers will resist our advice initially (until they see the results, of course).