Last week I came across an item on the Netflix Tech Blog: a slide deck and presentation outlining their plans for A/B testing their new web interface running on Node.js.
It was all very worthy stuff, although I wasn't entirely sure what they were doing or trying to achieve with this testing; the body of the content spent most of its time waxing lyrical about not having to touch their underlying system platform. I guess this is a big deal for Netflix! Anyway, as an avid Netflix user I believe I have witnessed their testing strategy firsthand, but in a rather perplexing fashion. Below is a screengrab of what I'm talking about. Under the documentaries recommendations I frequently see at least two listings of 'The Long Way Down'. These are for the same programme/episode, but one is shown with some weird Instagram-style effect and the second in monochrome with a variation in image.
Now this is either a mistake or an entirely new way of A/B testing. I've never seen test variations (if that's what they are) presented side by side before. If this is a legitimate test, isn't it a rather crude means of promoting an episode to the end user? Is the end goal to see whether fans of Instagram filters opt for one creative over another? Or is 'The Long Way Down' just such unmissable viewing that it warrants mentioning A LOT? Answers on a postcard. Happy testing : )
You can hire my services
I am Ben Lang, an independent web conversion specialist with over 20 years of experience in IT and digital and 12 years of Conversion Rate Optimization (CRO) know-how.
I provide a full analysis of your website's conversion performance and the execution of tried-and-tested CRO exercises through A/B testing, split testing or MVT (multivariate testing), deployed to fix your online conversion issues.
Contact me at https://www.benlang.co.uk/ for a day rate, or catch up with me on LinkedIn.
What is MVT testing?
In its simplest form, MVT, or multivariate testing, is where you test alternative experiences of an existing webpage against the current page design or user journey. Unlike A/B or split testing, where you test one webpage design against another, in MVT you test a combination of page elements (alternative page copy, headings, images, buttons and so on) all at once to see which specific combination of alternative designs works best in driving page visitors or customers towards an end goal of your choosing.
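To give a feel for how quickly those combinations stack up, here is a minimal sketch (the element names and variant copy are made-up assumptions, not from any real test) that enumerates every experience a full-factorial MVT test would have to serve:

```python
from itertools import product

# Hypothetical page elements and their variants (illustrative only).
elements = {
    "headline": ["Save money today", "Compare quotes in minutes"],
    "hero_image": ["family", "car", "piggy_bank"],
    "cta_button": ["Get a quote", "Start now"],
}

# A full-factorial MVT test serves every combination of every variant.
combinations = list(product(*elements.values()))
print(f"Total experiences to test: {len(combinations)}")  # 2 x 3 x 2 = 12

for combo in combinations:
    print(dict(zip(elements.keys(), combo)))
```

Even with just three elements and a handful of variants you are already splitting your traffic twelve ways rather than two.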
All testing, regardless of whether it's an A/B or MVT test, requires two things: time and traffic. By that I mean you need to give a test enough time to establish what's called statistical significance, basically to reach a stable conclusion, and you will also require a volume of website traffic to churn through your test experiences. Typically you need less traffic to run an A/B test, which normally only has a couple of alternative page designs to get through. In an MVT test you will typically have a higher number of test combinations and will therefore require enough traffic to 'feed' the test.
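As a rough guide to the traffic side of that equation, here is a minimal sketch of the standard two-proportion sample-size estimate; the 3% baseline conversion rate and 10% relative uplift are assumed figures you would swap for your own:

```python
from statistics import NormalDist
from math import ceil

def visitors_per_variation(baseline_cr, min_detectable_uplift,
                           alpha=0.05, power=0.80):
    """Approximate visitors needed per variation for a two-proportion test."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + min_detectable_uplift)  # relative uplift we want to detect
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_beta = NormalDist().inv_cdf(power)            # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2
    return ceil(n)

# Assumed figures: 3% baseline conversion, hoping to detect a 10% relative uplift.
print(visitors_per_variation(0.03, 0.10))
```

Multiply that per-variation figure by the number of experiences in your test and you can see why an MVT test with a dozen combinations needs far more traffic, and time, than a simple A/B test.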
In all test situations you continue to send a proportion of traffic to your existing or default page to benchmark your alternative experiences against it. This is where you calculate any uplift in conversion, be that click-through rate or the number of people getting to and passing through your checkout process.
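Here is a minimal sketch of the uplift calculation itself, together with a quick two-proportion z-test to sanity-check whether the difference against the control is statistically significant; the visitor and conversion counts are purely illustrative assumptions:

```python
from statistics import NormalDist
from math import sqrt

def uplift_and_significance(control_visitors, control_conversions,
                            variant_visitors, variant_conversions):
    """Relative uplift of the variant over the control, with a two-sided p-value."""
    cr_control = control_conversions / control_visitors
    cr_variant = variant_conversions / variant_visitors
    uplift = (cr_variant - cr_control) / cr_control  # relative uplift vs control

    # Two-proportion z-test using the pooled conversion rate.
    pooled = (control_conversions + variant_conversions) / \
             (control_visitors + variant_visitors)
    se = sqrt(pooled * (1 - pooled) *
              (1 / control_visitors + 1 / variant_visitors))
    z = (cr_variant - cr_control) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return uplift, p_value

# Illustrative numbers: a variant converting roughly 8% above the control.
uplift, p = uplift_and_significance(10000, 300, 10000, 324)
print(f"Uplift: {uplift:+.1%}, p-value: {p:.3f}")
```

Note that with these illustrative numbers an apparent 8% uplift is not yet statistically significant (the p-value comes out around 0.33, well above the usual 0.05 threshold), which is exactly why a test needs enough time and traffic before you call a winner.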
Testing outcomes can be positive or negative. A positive outcome might be that you test a red checkout button versus a blue checkout button and the red button delivers 8% more purchases. A negative outcome might be that the same button delivers a 5% drop in purchases. Either way you have a valuable learning that tells you what changes to make to your site or informs further testing.
The Gutenberg rule - revisited
In a previous post about the Gutenberg rule I mentioned that we'd achieved some good A/B and MVT testing results using this design principle. The principle works off the theory that humans subconsciously scan a print or web page from top left to bottom right and then loop back up the page. Allowing this principle to govern your page layout seems to have become the norm in web design, placing buttons and other key calls to action in the 'fertile' areas, typically the bottom right.
Increasingly, though, I notice that more and more people are breaking from this practice, whether through test learnings or just aesthetic decisions. An example is Aviva.co.uk. As you can see below, their landing page for car insurance aligns much of its content to the left of the page. Having become comfortable with placing the onward journey point at the bottom right, this just jars. Having said that, it's important to challenge the norms to see what resonates with the customer. I may have to test this layout myself to check we're not missing a trick.
Happy testing : )
In the meantime here's a mock-up of an alternative design I would test based on established findings.
Page fold - browser size tool revisited
Problem: Google have recently moved their Browser Size tool into Google Analytics. For me, and for many other sites, this means it simply no longer works, as the In-Page reporting that shows heatmaps and so on doesn't execute with our lovely CMS. Boo!
Solution: There's a free alternative called whereisthefold.com. Yay!
This is what it looks like on this blog.
a/b testing tools comparison by popularity
It's also interesting to see how many new companies and start-ups have entered the CRO business since I last did this audit back in 2010, and also how many providers are no longer around. It's a fierce and competitive business...
So here's what I've found as at August 2014. Sources: Alexa.com, Google AdWords and the vendors' own sites.
Updated 13th Oct 2014
Reach ranking | Vendor | Alexa volumes YTD | Avg. monthly searches for product | No. of publicly listed clients |
1 | SiteSpect | 381,175 | 98,600 | 55 |
2 | adlucent | 324,719 | 27,000 | 0 |
3 | Google Experiments | 10,806,000 | 802,000 | 0 |
4 | Conversion multiplier | 3,987,218 | 73,600 | 0 |
5 | Global Maxer | 3,210,851 | 51,000 | 0 |
6 | Conductrics | 2,333,087 | 14,700 | 0 |
7 | Qubit | 1,506,519 | 25,880 | 131 |
8 | Avenso | 1,020,946 | 1,030 | 32 |
9 | Clickthroo | 974,000 | 47,819 | 0 |
10 | Adobe Test and Target | 945,671 | 78,000 | 0 |
11 | Visual Website Optimizer | 908,000 | 13,102 | 0 |
12 | Accenture | 860,000 | 13,240 | 0 |
13 | Webtrends | 763,000 | 19,600 | 0 |
14 | Hi Conversion | 722,004 | 47,000 | 44 |
15 | Get Smart Content | 673,173 | 3,740 | 14 |
16 | Convert | 672,000 | 126,000 | 0 |
17 | Taplytics | 283,016 | 12,789 | 6 |
18 | Optimizely | 239,000 | 58,898 | 25 |
19 | Maxymiser | 149,697 | 6,980 | 78 |
20 | Site Tuners | 121,678 | 13,000 | 72 |
21 | Autonomy | 120,271 | 26,000 | 0 |
22 | AB Tasty | 93,689 | 27,000 | 31 |
23 | Monetate | 36,501 | 21,230 | 39 |
24 | Unbounce | 22,230 | 12,230 | 0 |
25 | Hubspot | 2,700 | 541 | 0 |
26 | Genetify | 0 | 6,830 | 0 |
27 | Vanity | 0 | 6,400 | 0 |
Happy testing : )