Monday, 28 March 2011

The conception of Short Wave Testing

Right, well this is very much a work in progress. I think I've invented a new form of multivariate testing on the web, and for clarity it has nothing at all to do with Short Wave Radio. A couple of caveats to start off with: A) I'm not entirely sure it hasn't been done before, and B) I'm not entirely sure it's a valid test methodology.

Well, hang it, this blog is all about being a testing 'Maverick', so here goes nothing....

First off, let's not confuse this with the Iterative Wave Testing used by Optimost. I think I'm right in saying that's where you test the same variants over a sustained period in 'waves' of testing, to ensure that what you have is validated and statistically significant. All very worthy, good stuff.

What I've been experimenting with is trying a set of test variants in one brief wave of testing, then ditching (culling) any negative or under-performing variants in favour of entirely new variants in the next wave, while the positive, successful variants are carried forward from the last wave. The whole process is repeated for as many waves as it takes to arrive at a robust set of variants that out-performs everything pitted against it. The only qualifying criterion for a variant to be carried forward to the next wave of testing is that it either continues to outperform the original default design or betters the performance of anything that has gone before it, i.e. anything that has previously been removed.
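The cull/carry-forward rule above can be sketched in a few lines of code. This is a minimal illustration only: the function name, variant names and conversion numbers are all hypothetical, not data from the actual tests.

```python
# A minimal sketch of the short-wave cull/carry-forward rule.
# All names and numbers below are hypothetical illustrations.

def next_wave(results, default_rate, best_culled_rate, new_ideas):
    """Decide which variants survive into the next wave.

    results          -- dict of variant name -> conversion rate this wave
    default_rate     -- conversion rate of the original default design
    best_culled_rate -- best rate among variants culled in earlier waves
    new_ideas        -- iterator of fresh variant names to replace culled ones
    """
    bar = max(default_rate, best_culled_rate)  # the hurdle a variant must clear
    survivors, culled = {}, {}
    for name, rate in results.items():
        if rate > bar:
            survivors[name] = rate   # carried forward into the next wave
        else:
            culled[name] = rate      # replaced by a brand-new variant
    replacements = [next(new_ideas) for _ in culled]
    return survivors, culled, replacements

# Example wave: the default converts at 2.0%, nothing culled yet.
ideas = iter(["B", "C", "D"])
surv, cull, repl = next_wave({"A": 0.024, "X": 0.018}, 0.020, 0.0, ideas)
print(surv)   # {'A': 0.024} -- A outperforms the default, so it stays
print(repl)   # ['B']        -- X is culled and B enters the next wave
```

The key design point is the hurdle: a survivor must beat the better of the original default and the best variant ever removed, so the bar only ever rises.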

I hope this simple(ish) diagram illustrates how short wave testing works. Below we have 4 test areas in a web page and 4 phases of testing. As we can see, in Test Area 1 Variant A is successful enough never to be culled from the test and ultimately becomes the winner for Test Area 1. Test Area 2 shows an initially unsuccessful Variant A that is culled after the first phase of testing and replaced with a new Variant B, which goes on to be the winning variant for Test Area 2. Test Area 3 tells a different story: in the end it takes 4 different variants over 4 phases of testing to find one positive enough to be declared a winner. And Test Area 4 arrives at a winner in the third phase of testing with Variant C.
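For readers who can't see the diagram, the same story can be written out as data. The '+'/'-' outcomes below mirror the diagram's narrative only; they are not real test results.

```python
# The diagram's four test areas written out per phase.
# '+' means the variant beat the hurdle and was carried forward;
# '-' means it was culled and replaced by the next lettered variant.
# (These outcomes mirror the diagram's story, not real test numbers.)

phases = {
    "Area 1": [("A", "+"), ("A", "+"), ("A", "+"), ("A", "+")],  # A never culled
    "Area 2": [("A", "-"), ("B", "+"), ("B", "+"), ("B", "+")],  # B wins after A goes
    "Area 3": [("A", "-"), ("B", "-"), ("C", "-"), ("D", "+")],  # takes 4 variants
    "Area 4": [("A", "-"), ("B", "-"), ("C", "+"), ("C", "+")],  # C wins on phase 3
}

def winner(history):
    """The winner is whichever variant is still standing after the final phase."""
    last_variant, outcome = history[-1]
    return last_variant if outcome == "+" else None

for area, history in phases.items():
    print(area, "winner:", winner(history))
# Area 1 winner: A, Area 2 winner: B, Area 3 winner: D, Area 4 winner: C
```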

Now I'm aware that this form of testing is both labour-intensive and resource-heavy in its undertaking. I was able to do it because I was motivated enough to dedicate resource to it and had enough ideas in the locker for each test area and test wave. I used Google Website Optimizer, coded the variants myself, and the outcome has been, well, staggering: a sustained uplift in the region of 18% for product purchase (a personal best, BTW). I'm reasonably confident in the results because the final variants reported consistently the same uplift over 9 separate waves of testing.
What I'm hoping for now is the counter-argument from my testing peers (drop me a line). I'm aware of the shortcomings of this approach but want others to have their say on this kind of testing methodology. Here's my bonfire, feel free to piddle all over it : ) Happy Testing!

UPDATE: One thing worth noting with this approach is that, if it goes right, the conversion rate for the test variants should improve with each wave in which you attain, keep or build on positively performing variants, but at the same time you will see a diminishing uplift per wave. This is because you are continually testing against improved, stronger-performing variants in the test segment. Ultimately, though, you should still see a good uplift against the underlying original default design.
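A quick bit of arithmetic shows the shape of this effect. The starting rate and per-wave uplifts below are made-up numbers purely for illustration: each wave beats the previous wave's champion by less, yet the cumulative uplift over the original default keeps growing.

```python
# Hypothetical illustration of shrinking per-wave uplift versus growing
# cumulative uplift over the original default. Numbers are made up;
# only the shape of the effect matters.

default_rate = 0.020                        # original default converts at 2.0%
wave_uplifts = [0.10, 0.06, 0.03, 0.015]    # each wave beats the PREVIOUS champion by less

rate = default_rate
for wave, uplift in enumerate(wave_uplifts, start=1):
    rate *= 1 + uplift                      # new champion beats last wave's champion
    vs_default = rate / default_rate - 1    # but is measured against the original
    print(f"wave {wave}: +{uplift:.1%} vs last wave, "
          f"+{vs_default:.1%} vs original default")
```

By the fourth wave the per-wave gain is only +1.5%, but the compounded uplift against the original default is about +21.9%, which is why the final comparison should always be made against the underlying default design.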

Friday, 25 March 2011

No need to shout about it

I've been running an MVT test on a comparison page and recently introduced an 'Ends Soon' label next to the product call to action. Initially the presence of this message was negative for conversion. I resized the image to half its original size, making it much smaller, and the conversion results are much improved, illustrating that sometimes people just don't want to be shouted at : )

Update: Although this 'hurry message' didn't work well on this particular page, which was a product comparison page, the same image used on an already-optimized product page (using Maxymiser) has led to a 44% uplift in product application submit rate.