Thursday, 28 October 2010
Google Optimiser or similar. We've done loads of testing in the past on buttons - testing colours, sizes, 'Apply' text and so on - but I read an interesting article from Get Elastic on how unusual button designs can give you an easy uplift in conversion. So over the course of a couple of months I tested all the designs you see here on a landing page. No.1 was the default design, and the winner was... No.3, the 'boxed arrow' design, with a 32% uplift in click-to-apply rate. The arrow-based designs were at the upper end of the results overall, while the, *cough*, phallus-based designs stole an early lead but didn't win out. Give it a go on your website - it's quick, easy and surprisingly fun.
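For anyone wanting to check the maths, uplift here is just the variant's click rate relative to the control's. A quick sketch - the visitor and click counts below are hypothetical, picked only so the result lands on the 32% figure:

```python
def uplift(control_rate: float, variant_rate: float) -> float:
    """Relative uplift of a variant over the control, as a percentage."""
    return (variant_rate - control_rate) / control_rate * 100

# Hypothetical numbers: default button clicked by 250 of 10,000 visitors,
# the 'boxed arrow' design by 330 of 10,000.
control = 250 / 10000
variant = 330 / 10000
print(f"{uplift(control, variant):.0f}% uplift")  # prints "32% uplift"
```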
Thursday, 21 October 2010
First off - What is a follow-up experiment?
And why should I run a follow-up experiment?
The screenshot below shows the original MVT test results.
I commenced a follow-up test, running the winning variant from this test in a head-to-head with the original default. And this is what happened...
The blue line is the original design beating the first test's winning variant. This has happened time & again with my follow-up experiments. Then I noticed something. When you set up a follow-up experiment it's easy to overlook the weightings setting - the 'choose the percentage of visitors that will see your selected combination' option. By default it's set to 95% for your selected combination.
Now I can't offer any explanation, but from previous testing with other tools such as Maxymiser we've seen that when you up-weight a particular variant in favour of another, invariably its conversion performance goes down, sometimes radically so. I recommend sticking to a 50/50 weighting in any follow-up experiment, because for whatever reason an unequal weighting seems to skew performance. Just be aware of this possibility and you'll be fine :)
If anyone can offer me a scientific explanation for this behaviour I'm all ears!
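I can't claim this is the explanation, but one statistical side effect of a 95/5 split is worth noting: the down-weighted arm collects far fewer visitors, so its observed conversion rate is much noisier. A rough sketch of the standard errors involved, assuming a hypothetical 5% base conversion rate and 10,000 total visitors (not figures from this test):

```python
import math

def std_error(p: float, n: int) -> float:
    """Standard error of an observed conversion rate p measured over n visitors."""
    return math.sqrt(p * (1 - p) / n)

TRUE_RATE = 0.05   # hypothetical underlying conversion rate, same for both arms
VISITORS = 10_000

# 95/5 weighting: the minority arm sees only 500 of the 10,000 visitors
se_minority = std_error(TRUE_RATE, int(VISITORS * 0.05))
# 50/50 weighting: each arm sees 5,000 visitors
se_even = std_error(TRUE_RATE, VISITORS // 2)

print(f"SE at   500 visitors: {se_minority:.4f}")  # ~0.0097
print(f"SE at 5,000 visitors: {se_even:.4f}")      # ~0.0031
```

The minority arm's standard error is roughly three times larger, so its measured rate can swing well above or below its true rate. That doesn't prove the skew I've seen, but it does make head-to-heads at extreme weightings hard to read.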
By the way, below shows the test after the weightings are reset to a 50/50 split. Bit different from the original follow-up experiment, no?
Thursday, 14 October 2010
It's good to see some examples of A/B testing that aren't just about which web page works best, and here's another example of magazine cover split testing. This month's issue of Company magazine is running with two variations of its cover - one with presenter Fearne Cotton, the other with presenter Konnie Huq (who's married to this guy, by the way). As you can see, both covers are the same bar the mention of the featured presenter and the hero shot; even the poses are almost identical. I've mentioned magazine cover testing before in a previous article here, but I haven't seen much evidence of the Press engaging in this kind of marketing test in recent years. Would be (mildly) interesting to see who wins this particular test.