You can hire my services

I am Ben Lang, an independent web conversion specialist with over 20 years of experience in IT and digital and 12 years of Conversion Rate Optimization (CRO) know-how. I provide a full analysis of your website's conversion performance and execute tried and tested CRO exercises, through AB testing, split testing or MVT (multivariate testing), to fix your online conversion issues. Contact me at https://www.benlang.co.uk/ for a day rate or catch up with me on LinkedIn.

Riding the tsunami - optimising an online marketing campaign



In the last three months we've taken web optimisation in a new direction. To date we've been trying (emphasis on 'trying') to follow the essential principles of MVT and split testing: put in test content, wait a period of time, let visitor volume do its work, see what works in getting more customers to convert and what doesn't, and so on. This is all well and good and will always be the core of what we do, but recently I decided to just go hell for leather on getting an uplift and cashing in on an already positive proposition.

Our bank pioneered the cash switching incentive: switch your current account to us and we'll give you £100 cashback for the pleasure. This has been an extremely good promotion for us and we routinely serve it on the website to drive current account applications, and boy does it work. I can't give you the monetary figures for obvious reasons, but there's a very good reason other high street banks are now offering the same incentive. Anyway, after seeing the sudden peak in traffic the site gets during the campaign, I decided to perform an optimisation test during the promotion period on the customer journey, or offer page. I basically came up with half a dozen alternative page designs that focus on the promotion and ran them directly against the default page design using our normal testing tool (Maxymiser). Because of the nature of the promotion period it is definitely not a conventional MVT or split test. I had no intention of seeing what would emerge as the outright winner over time; I just wanted to serve the best-performing content for that moment in time. So you run the 'test' and, due to the massive traffic volumes, you can just sit there and watch the customers come in and crank through the test variants very quickly. You soon see what works and what doesn't. I then engage in some aggressive culling of the negatively performing variants, along the lines sketched below.
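To make that culling concrete, here's a minimal sketch of the kind of rule you could apply, with invented conversion counts; Maxymiser has its own internal logic, so treat this purely as an illustration:

```python
from math import sqrt

# A crude culling rule: drop a variant once even its optimistic estimate
# (observed rate plus two standard errors) trails the default design.
# All counts below are made up for illustration.
def should_cull(conv_v, n_v, conv_d, n_d, z=2.0):
    p_v, p_d = conv_v / n_v, conv_d / n_d
    se = sqrt(p_v * (1 - p_v) / n_v + p_d * (1 - p_d) / n_d)
    return p_v + z * se < p_d

# Variant at 2.1% vs default at 3.0% after a burst of campaign traffic.
print(should_cull(210, 10_000, 300, 10_000))  # True -> cull it
```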

So after an initial trial I already know the best page design for this particular campaign and will serve that content again during the next campaign period, knowing it will be successful and do its job. This means I can challenge this 'champion' at my leisure. In terms of results, the first run gave me a 44% uplift over the default page design in getting people to apply for the product. Wow!

UPDATE: 15th August 2011.
Coming late to the party, but getting there in the end, Get Elastic (a very good web optimisation site) has endorsed this approach (after a bit of soul searching) with an article titled 'Should You Avoid Testing During a Traffic Spike?'

Optimization of a product page


Like most high street banks we have a Loans product page, but we hadn't carried out any kind of optimization activity on this page, or on this product for that matter. Generally speaking, the thing that sells a loan is the rate (natch). But we have found that what visitors most want to see on a Loans product page is a loans calculator, so they can see what their repayments are likely to be.
Therefore the following test was planned and conducted...

Objective – To get more people to use the Loans Calculator and more people to apply for the Loan product and submit the application form.

The test ran continuously for 27 days, from 21/08/2009 to 16/09/2009.

Test audience – 100% of traffic to Highstreet Bank Loans Hub page.

Hypothesis
– By testing alternative page designs on the Loans Hub page we expected to see an improvement in the visitor-to-apply ratio of 3% and an improvement in our visitor-to-submitted-application ratio of 5%. We also hoped to see a 3% improvement in visitors using the calculator.

Test Results Summary

  • 9.08% improvement in visitor to apply ratio
  • 4.59% improvement in the submitted application ratio
  • 4.12% improvement in visitors using the calculator
  • 5.06% improvement in product accept rate

The existing page design was as follows (note the placement of the loans calculator)...

The winning page design was as follows...



Put into context
To put this improvement into context, if all of the 126,439 visitors during the test period had been shown the new design:

126,439 visitors to the winning design × 16.17% visitor conversion × 33.2% accept rate during the test = 6,787 accepted applications

126,439 visitors to the default design × 15.17% visitor conversion × 31.6% accept rate during the test = 6,061 accepted applications

This equates to an 11.9% increase in accepted applications.
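For anyone who wants to check the working, the projection is a straight multiplication of the quoted rates:

```python
# Reproducing the projection above from the quoted test-period rates.
visitors = 126_439

winning = visitors * 0.1617 * 0.332  # new design: conversion x accept rate
default = visitors * 0.1517 * 0.316  # default design

print(int(winning), int(default))               # 6787 and 6061, as above
print(f"{winning / default - 1:.2%} increase")  # ~11.99%, the ~11.9% quoted
```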

Findings


Bringing the loan calculator and apply-now button front and centre focuses a visitor on the primary calls to action: perform a calculation, click apply.

The static right-hand promo tile, previously used to host the loan calculator, was now used for creative that played on three core themes: 'cash', 'car' and 'home improvement'. We wanted to see which image and theme appealed to the most loans customers. In the past we have used a variety of images in the loans section that appealed to these areas; this test would go some way to answering which theme had the widest appeal.

We're aware that images are largely subjective in turning people on or off, but using a very basic image that illustrates a single concept provides a jumping-off point for enhancing the 'home improvement' message relating to a personal loan application.

How to increase average spend - The 97p Shop


Cross-selling and add-on sales are nothing new. If you visit Amazon often enough you'll be more than aware of the best-in-class method of cross-selling; they are the masters. Wish lists, gift lists, recommendations, you name it, they have it. Anyway, something caught my eye the other day when I was visiting wilkinsonplus.com. They've introduced the 97p Shop, which essentially offers a range of items for, yes you've guessed it, 97p. They promote this shop right across the customer journey.

(Screenshots 1-3: the 97p Shop promoted at successive stages of the customer journey.)
By the time you've selected an item from their main range and reached the checkout, you're already primed to top up your purchase with an item from the seemingly low-cost range. Most people won't blink at spending an extra 97p on the average transaction, and this particular price promotion will easily fly under most shoppers' cost radar. The growth and success of retailers like Pound Stretcher on the UK high street during this economic downturn also adds weight to this particular promotion: we've already adjusted our psychological perception of the value of £1 and how far it should go. Wilkinsons are effectively taking on these established budget shops by playing them at their own game. A similar approach can be seen at your local Tesco, where they've introduced budget ranges that compete directly with the likes of Lidl and Aldi. If Wilkinson.com make a success of this campaign (as I suspect they might) they could end up effectively owning the 97p value in customers' minds, and it will become an endemic part of their brand.
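The arithmetic behind the average-spend lift is simple; here's an illustrative calculation, assuming a hypothetical £20 average basket and a range of attach rates (none of these figures come from Wilkinson):

```python
# Illustrative only: effect of a 97p add-on on average order value.
basket = 20.00  # hypothetical average basket in GBP

for attach_rate in (0.10, 0.25, 0.50):  # share of orders adding a 97p item
    lift = 0.97 * attach_rate
    print(f"{attach_rate:.0%} attach rate -> +£{lift:.2f} "
          f"({lift / basket:.2%} on the average order)")
```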

Eye Tracking


I recently had the opportunity to witness eye tracking research on our websites, undertaken at a company called Foolproof in London. Eye tracking is the process of measuring either the point of gaze ("where we are looking") or the motion of an eye relative to the head. Applied to website usage, it is a method of capturing eye movement in relation to web page design, content, layout and usage. Eye trackers are used in research on the visual system, in psychology, in cognitive linguistics and in product design.

For this particular session you sit in a viewing suite, on comfy sofas, while people next door are monitored as they complete a customer journey on your site. Their eye movements are tracked and their monitor is projected onto a big screen in front of you. The image below is of this event, captured rather badly on my mobile camera (apologies for the terrible quality). Bottom left is the interviewee on a video feed and on the right is a projection of their monitor with eye tracking overlaid in real time.




It's all very interesting. You get a real insight into what people genuinely experience in the real world. There were many learnings to be had from this session, too many to repeat here. One key one for me, however, was how users view our '3 Box Wonder' page design. This is a page design we arrived at as the outcome of extensive multivariate tests (see image below). It's an unpopular design here in ecommerce, but it performs consistently well in getting customers to convert. The eye tracking finding suggested that the reason it may be successful is that the negative points relating to the product are contained within the three stacked boxes on the right. These boxes happen to look like Google AdWords ads, so they are almost subconsciously ignored by the visitor. This is not something we had intended with this design, but it certainly goes some way to explaining why it is so good at what it does! It also helps to prove the value of eye tracking.

3Box Wonder...


Twitter tools (Part Two) - revised

Following on from my last Twitter-related post about a few tools that caught my eye, I'm adding a few more to the list (maybe not all 100% useful, but they show an interesting use of the Twitter API none the less):

1. http://tweetingtoohard.com/ - This site describes itself as a place 'Where self-important tweets get the recognition they deserve'. It's a listing of the most egotistical tweets/tweeters out there. Disturbing but fun.

2. http://tweleted.com/ - Thought that tweet had been deleted? Think again. Handy for looking up Gail Porter's (@Gailporter) mobile number too.

3. http://twitturly.com/ - This site tracks and ranks which URLs people are talking about on Twitter. Looking up individual tweeters' URL postings is quite useful too.

4. http://dossy.org/twitter/karma/ - Similar to my previous recommendation of http://friendorfollow.com/ for graphically representing who's following who, but with a much slicker interface.

5. http://twistori.com/ - A fairly basic, but slightly compulsive, site for displaying live feeds grouped by six simple concepts: love, hate, think, believe, feel and wish.

Read part one of this post (more Twitter tools)

Behind the paywall - Can you really charge for News content?




As MediaGuardian, the New York Times and News Corp as a whole start making noises again about charging for their content, it begs the question: is this idea at all workable? There seem to be too many barriers to it ever actually happening.

1. The BIGGY. Most web users can get the same information a click away for free and carry the mindset that the majority of web content is free. How can you change this ingrained expectation and behaviour?

2. The obvious scenario, perhaps. Website X throws up a paywall* and starts charging for access to its cherished content. User Y subscribes to the content, copies it and immediately disseminates it for free.

3. Advertising revenue. Advertisers would run for the hills. Why pay for placement on a site that will invariably attract fewer visitors than before? The exception to this has been FT.com. Because they successfully charge their users high premiums for access to content, they can charge equally high premiums to advertisers. Those who do pay for digital content are a valuable audience, both in terms of the additional profile data from registrations and the simple fact that they are demonstrably willing to pay. But, and it's a big but, on the web they are still the exception, not the rule.

4. The BBC. The BBC is, as ever, the eccentric case in this mix. They already have a paywall in the sense that they charge the UK public an annual licence fee of £142.50 for their service through their Public Service Broadcasting remit. The online content continues to be funded this way, and while that is the case how can other news sources possibly compete?

5. Supply outstrips demand. There are still more news sources than consumers on the web. Survival of the fittest invariably kills off the weaker brands over time, but the biggest growth in news output comes from the blogging community and social media.

6. Credit Crunch. As we continue to be gripped by the economic downturn how many of us really value our daily news injection from Website X, Y or Z over other more 'essential' financial obligations?

* Paywall - A website that restricts access to certain content only to paid subscribers.

UPDATE: 20th April 2011
This article in the Guardian today has probably given us a sneak peek of the future of paywalled content: Slovakia's media throw up a universal paywall.

Page Combination Removal Feature in Maxymiser

We use Maxymiser as our multivariate/AB testing tool of choice. Maxymiser allows you to see how individual test variants are performing in its console. In a previous post about culling variants I talked about how we like to actively remove under-performing variants from our tests. This has been a contentious issue, whether or not it's the right thing to do in a test scenario (see that post for the full discussion). However, Maxymiser have recently added a new feature to their test console that allows you to remove under-performing page combinations* from your test. Doing this also allows you to immediately and clearly see the impact of such an action on the remaining page combinations.

Below is a screenshot of our Maxymiser console displaying an active test before we remove an under-performing page combination. The P8 page combination is highlighted as under-performing: it has an uplift of minus 1.76% and a 'Chance to Beat All' value of 25.23%.



Now, after we exclude the P8 page combination from the test, you can see what the results look like. The lead page combination, P2, leaps from a 39% 'Chance to Beat All' to 51%, speeding up your test. The overall uplift value also moves from 3.19% to 3.59%.



So the introduction of the page exclusion feature actually allows you to experiment more with 'what if' scenarios; a rough sketch of the underlying idea is below. We like to think that this enhancement was made to Maxymiser as a direct response to our need, but it's probably not the case!
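For the curious, a 'Chance to Beat All' style figure can be approximated with a simple Monte Carlo simulation. The sketch below uses invented generation and conversion counts and is not Maxymiser's actual algorithm; it simply shows why excluding a combination lifts the leader's score:

```python
import random

# Estimate each combination's probability of being the best by sampling
# conversion rates from Beta posteriors (all counts here are invented).
def chance_to_beat_all(results, draws=20_000):
    wins = dict.fromkeys(results, 0)
    for _ in range(draws):
        sampled = {name: random.betavariate(conv + 1, n - conv + 1)
                   for name, (conv, n) in results.items()}
        wins[max(sampled, key=sampled.get)] += 1
    return {name: round(w / draws, 3) for name, w in wins.items()}

results = {"P2": (260, 5_000), "P5": (245, 5_000), "P8": (215, 5_000)}
print(chance_to_beat_all(results))  # P8 soaks up some probability mass
del results["P8"]                   # exclude the under-performer...
print(chance_to_beat_all(results))  # ...and the leader's score rises
```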


* A page combination is a collection of multivariate test variations

My Twitter toolset - a few of the most original & useful tools out there

There are many listings of the plethora of tools that have been developed for using Twitter, directly or indirectly, with its many facets and features. I've decided to put together a list of what I think are the most useful of the ones I've seen and use regularly. Here's my top eleven (everyone does ten):



1. http://twittersnooze.com/ TwitterSnooze - I love this. You can pick people you are following who, shall we say, are a little over-enthusiastic on the tweet front, and 'rest' them from your news feed without unfollowing them altogether. Genius really.



2. http://twitterholic.com/ Twitterholic - gives you the current list of who's most popular on Twitter and also lets you find out your own ranking. Rather surprisingly, @stephenfry is ranked somewhere in the mid-sixties at the time of writing.



3. http://www.hashtags.org: 'Hashtags' – a word prefixed with the # symbol – are used by Twitterers to make it easier for people to follow key topics on Twitter. For instance, users tweeting about the recent swine flu outbreak added #swineflu to their posts. hashtags.org provides an at-a-glance running ticker of popular hashtags, to help Twitter users keep up to speed with hot topics.



4. http://friendorfollow.com/ Friend or Follow - allows you to see who you're following but who's not following you back (how rude). You can then manually unfollow each if you so desire, or use the next tool in the Twitter arsenal...

5. http://socialtoo.com/twitter SocialToo - This tool allows you to do a host of different things automatically in Twitter. Some services, however, you will have to pay for. Amongst its features are the ability to automatically follow people who follow you and automatically unfollow people who don't follow you. I've signed up for this particular paid service as I think it's actually the most powerful Twitter application I've seen so far. You have to kick it off periodically and the script can take an eon to run, but it does what it says on the tin: log into your account after it's run and you'll suddenly see you are following the same number of people as are following you.



6. http://tweeteffect.com/index.php Tweet Effect - This tool helps you find out which of your Twitter updates made people follow or leave you. It's certainly an interesting insight into the way people view you on Twitter, but it has to be taken with a pinch of salt: it basically suggests that with every tweet you gain or lose followers. Twitter's not that straightforward in reality; people sometimes follow/unfollow for other reasons, not as the result of a single tweet. That being said, it's still a good barometer for your news feed.



7. http://www.twitpic.com/ TwitPic - Basically this site lets you share photos on Twitter. If you have a Twitter account, you have an implied TwitPic account. Send a photo from your phone (or wherever) to your Twitter account and it gets hosted on this site. Settings allow you to either automatically post the image to your Twitter feed or do it manually at a later date.




8. http://bit.ly/ Bit.ly - a site where you can pump in your stupidly long URL and get it shortened to a much more Twitter-friendly (below the 140-character limit) length. Alternatively there's http://tinyurl.com/, the original URL shortener.




9. http://www.twitalyzer.com/twitalyzer/index.asp Twitalyzer - a tool to evaluate the activity of any Twitter user and report on relative influence, signal-to-noise ratio, generosity, velocity, clout and other measures of success in social media. This one's okay, but again I have my reservations: yes, they're using some interesting metrics, but it seems to me that success on Twitter will always be measured by the number of followers you have, whichever way you cook your stats!



10. www.tweetdeck.com Tweetdeck - Doesn't really need any introduction, but... it's an extremely popular desktop app that makes it easy to break feeds into manageable chunks, and even categorise replies. Requires a computer running Adobe AIR, but works on both PCs and Macs.

11. http://www.twittergadget.com/ TwitterGadget - just one of dozens of Twitter clients for iGoogle. It allows you to do almost everything you can do in Twitter, but from your iGoogle homepage. You've got to be a fan of iGoogle to buy into this way of working, which I am!

Read Part Two of this post (more Twitter tools)

Striking while the iron is hot




Probably the holy grail of any web optimisation exercise is to be serving the most optimised page content at the most effective time possible. For us in banking this opportunity presents itself once a year in the form of the ISA Season*.

We were keen to improve the conversion of ISA landing pages in preparation for this busy sales time for the bank. This test was for an ISA savings product page and we were looking to test header copy, images and body copy combinations in order to improve conversion. The winning design would be implemented in time for the peak volumes during this period.

The Hypothesis for this test is as follows:

By multivariate testing combinations of header copy, images and body copy on the ISA product page we expected to see a 10% improvement in the number of visitors starting an application for the product and a 5% improvement in our visitor-to-submitted-application ratio.



This test was run twice (between 13/03/2009 and 06/04/2009 (pre tax year) and between 06/01/2009 and 13/01/2009 (post tax year)). We were forced to conclude the second test early due to the product being withdrawn, but in both instances the winning design was the 'Right hand side tabbed content', although a different image won in each test. In the final test the lead design delivered a 16.14% increase in conversion (17.13% in the first test) over the default design, with a statistical significance level of 98.65%.



* The end of the old tax year and the start of the new one, which many in the financial services industry call "ISA Season".

Omniture Summit 09 London - Key Takeaways


I attended the Omniture Summit 2009 in London yesterday, travelling down to the big smoke from Leicester. We paid £178 a head to attend this shindig, along with around 800 other like-minded individuals. Although we don't currently use any Omniture tools (we're strictly Maxymiser right now, and frankly don't have that kind of money) we thought we'd see if we could pick up some optimization ideas from our fellow analysts and marketeers at the event.

So my takeaway learnings for the day were:

1. You're justified in letting brand design and message go out the window if there's a decent uplift in conversion to be had, as identified by your testing.

2. BT segment their customers by IP range. If the customer's IP is within the BT ISP range then they're an 'Existing Customer'; if not, they're a potential 'Switcher' (see the sketch after this list).

3. AutoScout24 use Google AdWords to segment their customers: those who search by brand and those who don't. If you arrive at their site via a keyword, you're part of their optimization test. They've learnt that those who search by brand prefer a highly branded landing page, while those who don't use a brand term prefer a less branded landing page.

Breakout talk - 'Listen to Your Audience: Test Your Content & Learn Where to Take Segment-specific Action'. Speakers: Sam Calvert, Head of Online Sales, British Telecom; Sebastian Wetteraurer, Team Leader Web & Data Analytics, AutoScout24.
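As a minimal illustration of the IP-range segmentation in point 2, here's a sketch using Python's ipaddress module; the CIDR block is invented for illustration, not a real BT range:

```python
import ipaddress

# Hypothetical ISP customer ranges; a real implementation would load the
# ISP's actual published CIDR blocks.
ISP_RANGES = [ipaddress.ip_network("81.128.0.0/12")]

def segment(ip: str) -> str:
    addr = ipaddress.ip_address(ip)
    if any(addr in net for net in ISP_RANGES):
        return "existing_customer"
    return "potential_switcher"

print(segment("81.130.4.2"))   # existing_customer
print(segment("203.0.113.9"))  # potential_switcher
```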

4. What worked well in AB testing:

  • Multiple calls to action
  • Empty pages convert better than fuller pages
  • Google users display different behaviour to non-Google users (although they didn't specify what behaviour)
  • Experiments need at least two to three weeks to run

Breakout talk - 'Why Should A/B Testing Be in My Marketing Strategy for This Year?'. Speakers: Nicolas Meriel, Team Lead Test&Target Consulting Services, Omniture. Gerard Lindeboom, Senior Web Analyst ABN AMRO.

5. Realise that emotion is a big factor in website design.
6. Usability labs are invaluable in understanding your customer experience.
7. Look after your customer. Delight and surprise them and they'll tell the world.
8. Aim to make navigation perpetual or persistent.
9. Familiarise yourself with customer language, i.e. do they search for a 'settee' or a 'sofa'?
10. Male & female search methodologies are different.
11. Relevancy is key so dynamic landing pages are essential.
12. Promote best products and offers in prime areas of your site. Products need to continually prove themselves for home page placement.
13. Optimise for any seasonal or opportunist event.
14. Keep up-to-date with search terms.
15. Abandonment - understand where you are losing customers and use testing to fix.
16. Establish a realistic conversion target.

Breakout talk - 'Flex Your Merchandising Muscles: Using Omniture Merchandising to Turn Shoppers into Buyers'. Speakers: Alison Lancaster, Marketing Director, Harrods Direct; Chris Moffatt, eMerchandising Consultant, Omniture.

The optimization of a landing page test

We have an AB test running on a product page and a landing page simultaneously, to see which location converts better. The background to this is that we once used landing pages a lot, and then the industry as a whole (banking) decided to move over to dropping people straight onto a product page. It became the established wisdom that product pages were better than landing pages for optimising your traffic. This test sought to revisit that hypothesis. We decided to use a specific traffic source, identify those visitors on page load and split the traffic 50/50: 50% remaining on the product page, 50% redirected to a bespoke landing page.
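For illustration, here is a minimal sketch of a deterministic 50/50 split; it assumes a stable visitor identifier such as a first-party cookie value, which isn't how any particular tool does it but captures the idea:

```python
import hashlib

# Hash a stable visitor id so a returning visitor always lands in the
# same arm, unlike a naive random() split.
def assign_arm(visitor_id: str) -> str:
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "landing_page" if int(digest, 16) % 2 == 0 else "product_page"

print(assign_arm("cookie-abc123"))
print(assign_arm("cookie-xyz789"))
```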

As well as testing the different locations we wanted to test a few other concepts of optimisation opportunities we'd heard about:

1. Do bespoke welcome messages help in converting users to apply for the product?
2. Does the presence of primary navigation on landing pages actually help visitor conversion?
3. Does a 'bandwagon' message help uplift, i.e. "join the thousands of other visitors in choosing this product", etc.?

When we started the test we soon found that the traffic source chosen as the test audience was a lot smaller than anticipated. Within a few days we had to adjust the 50/50 weighting and send 100% of the traffic to the landing page, with a view to retesting whether product pages work better than landing pages at a future date. If we'd left it as it was we would still have been running the test two years later, so low was the traffic to the page from that specific source.
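As a back-of-envelope illustration of why, here is a standard two-proportion sample-size calculation (95% confidence, 80% power); the baseline rate, target uplift and daily traffic are invented numbers:

```python
from math import sqrt, ceil

# Visitors needed per arm to detect a relative uplift, then days at a
# given daily traffic level. Constants are the usual 95%/80% z-values.
def days_needed(base_rate, rel_uplift, daily_visitors_per_arm):
    z_alpha, z_beta = 1.96, 0.84
    p1, p2 = base_rate, base_rate * (1 + rel_uplift)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         ) / (p1 - p2) ** 2
    return ceil(n / daily_visitors_per_arm)

# 5% baseline, 10% relative uplift, 40 visitors/day per arm:
print(days_needed(0.05, 0.10, 40))  # roughly 780 days -- about two years
```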

As the test is still running we're still unable to answer question 1, 'Do bespoke welcome messages help in converting users to apply for the product?'. However, within the first two weeks we were able to answer questions 2 and 3. The primary navigation in the landing page performed very badly, allowing users to leak out of the test page and not convert. The welcome message using the bandwagon theme also failed to convince any users to apply and performed very badly. Both the primary navigation and the bandwagon message are shown in the test page combination below (as taken from our Maxymiser test console). Upon conclusion of this test we will re-run the winning test variants on the product page to see which page gets the biggest uplift in conversion; this will be covered in part two of this post.

What is Statistical Significance?





I've sort of overlooked this topic since establishing this blog, but for subject completeness, shall we say, I think I should now cover the role of statistical significance in optimisation testing.

One of the biggest headaches in running an AB test or multivariate test on your website is knowing when your test is complete, or at least heading towards a conclusion. Essentially, how do you separate signal from noise?

Many third-party tools give you metrics to determine a test's conclusiveness; for example, the Maxymiser testing tool displays a 'Chance to Beat All' metric for each page combination or test variant within your test. More importantly, what underpins these tests is the concept of statistical significance. Essentially, a test result is deemed significant if it is unlikely to have occurred through pure chance. A statistically significant difference means that there is statistical evidence of a real difference.

Establishing statistical significance between two sets of results allows us to be confident that we have results that can be relied upon.

As an example, say you have an AB test with two different page designs. Analysing the data shows two results:

Page 1 - 1,529 generations with 118 responses or actions - giving a conversion rate of 7.72%.
Page 2 - 1,434 generations with 106 responses or actions - giving a conversion rate of 7.39%.

Looking at the two results, which do you think is better? Is page 1 better because it has a higher conversion rate than page 2? Firing those two results through a basic statistical significance calculator (I'm using Google's Optimizer test duration calculator) tells us that the two results are 0.335218 standard deviations apart and are therefore not statistically significant. This suggests that the difference in conversion rates is highly likely to be noise, so plough on with your testing. If 95% statistical significance is achieved you can safely say the test is conclusive, with a clear winner; that indicates a strong signal and gives you a result on a wholly statistical basis, as opposed to human interpretation. The sketch below reproduces the calculation.
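The calculation itself is a standard two-proportion z-test, which you can reproduce in a few lines (this is generic statistics, not the internals of any particular calculator):

```python
from math import sqrt, erf

# Two-proportion z-test on the figures above.
def z_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
    return z, p_value

z, p = z_test(118, 1529, 106, 1434)
print(f"z = {z:.3f}, p = {p:.2f}")  # z ~= 0.335, nowhere near the ~1.96
                                    # needed for 95% significance
```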

The HiPPO factor


Just learnt a great new acronym today: 'HiPPO' - Highest Paid Person's Opinion. Never more apt than when you're doing an optimization exercise on your website and the HiPPO blows your proposed new design out of the water because they don't like it, based on purely subjective thinking.

Found this on Dave Chaffey's site in an interview with Avinash Kaushik, author of Web Analytics: An Hour a Day.

The reptilian brain & your inner buy button

When you do AB testing or multivariate testing you are generally experimenting with a person's unconscious response to your test content. Basically, does a red apply button work better than an orange one, for instance? Marketing experts refer to this as connecting with, or appealing to, the primitive or 'reptilian' brain. When someone arrives on a landing page and makes a quick, intuitive decision, you are generally dealing with the inner, more reactive brain, not the higher conscious brain. You're not asking for any in-depth analysis of your page content; you just measure what works best for getting a person to perform a desired action in your test. It seems that for the majority of the time the reptilian brain is at the driving wheel, so we need to talk directly to the driver.





This is considered to be neuromarketing in its most basic form: the study of how we respond to adverts and products at a neurological level. In research in this area, volunteers have their brain activity monitored via brain scanners whilst being exposed to marketing media, to measure their response. The ability to understand how the inner workings of the brain process images and messages and reach decisions potentially gives marketers a new tool to fine-tune ads and marketing campaigns, bolster and extend brands, and design better products. Marketers' use of neuroscience technologies has alarmed some consumer groups, who fear it could lead to the discovery of an 'inner buy button' which, when pressed, would turn us into automated shoppers. Such fears spring from the increase in marketing-related problems such as pathological gambling, obesity and Type 2 diabetes.

The power of the testimonial

Been running a multivariate test on a current account product page again. This test has been running for over three weeks and, it could be said, not performing too well. Below is the readout from our Maxymiser console for this test.



So we sat down and thought about how we might turn this under-performance around. I had been hoping to use a customer testimonial in a test for a while, so I suggested that this might be the time and place to do just that. On the product page we had a test element that displays a strapline immediately under the product title. So we took the worst-performing variant and changed it to a short and snappy customer testimonial instead (see before and after below).



Now we re-commenced the test. We didn't go for a complete restart; we just took the test results from the new start date, when we added the testimonial variant into the mix. A few days in, the difference is incredible: the top-performing page combination, by a wide margin, contains our new testimonial, identified as V_013F on the report below. The test is still running at the time of writing, so we'll see whether this performance degrades over time and whether this new test variant ends up in the winning page combo. At the moment, though, a 26% uplift in conversion shouldn't be sniffed at, and it's another thing we can tick off as tested.












UPDATE: 24th March 2009


I thought I'd better update this post today to take account of what's happened to this test. The test has now been running for 63 days. Below is a screenshot of the report console for today. As you can see, the V_013F variant (the testimonial variant) is placed in the top three winning page combinations. On the embedded graph I've pinpointed where we introduced the customer testimonial variant into the test. It seems that after a brief lift the test settled back down into a level pattern, achieving roughly an 11.5% uplift in conversion, no longer retaining the 26% uplift first attained. The Maxymiser 'Chance to Beat' metric also hovers around the 46% level for the best page combination (P10), so the test seems in a state of flux and not really progressing towards statistical significance. I should also point out that we've culled a few negatively performing variants from the test.

Seasonal shift

The established wisdom of web page optimization suggests that images, and particularly images of people, help build a connection between the visitor and your content/product/brand. See my earlier post about quick wins on this.

We have been testing this theory through ongoing multivariate (multivariable) testing on several product pages on the site. However, we've inadvertently created a rod for our own backs.

The original test started back in November and we gave it an autumnal theme. The variant with these images originally performed well, but a gradual erosion in performance occurred as the test, and time, wore on. The test was still running as we entered the Christmas period, when the autumn theme suddenly looked out of place, so the drop in visitor appeal was understandable. To confirm this we revisited the variant and replaced the autumn images with winter-based ones. Again, an uplift in performance was immediately achieved. The test is still running and currently producing a respectable 8.95% uplift in conversion; but it's still wintry outside, with occasional snow flurries to reinforce that, so maybe this explains the appeal. As time moves on it will be interesting to see if this too begins to erode in performance.

So we have a situation where we know a particular piece of content works, but it will require constant maintenance to keep on top of the shifting seasonal themes. We'll need to come up with a sustainable image or theme that works just as well regardless of the time of year. We still have an MVT test on the roadmap for March/April that plays on the spring theme, but hopefully that will be this variant's last incarnation. All of the past and proposed images are below; we've nicknamed this variant 'Shallow Grave' due to the half-buried nature of the models in shot.
AUTUMN > WINTER > SPRING.....

Emphasis of key words

A recent finding of our optimisation testing on our product pages is that if you embolden key words that emphasize a positive aspect or feature of your product, you can more or less guarantee an uplift in conversion. The image below is taken from a recent winning test variant of a multivariate test we conducted on one of our savings product pages.





This winning page design resulted in a 4.32% improvement in the conversion of visitors clicking on the 'Apply Now' button and, more importantly, an 8.21% improvement in the conversion of visitors submitting the product application form. We're now rolling this keyword emphasis out across our other product pages.
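As a trivial illustration of the mechanics, here is a sketch that wraps hypothetical key terms in <strong> tags; the terms themselves are invented, not taken from our actual pages:

```python
import re

# Invented key terms; in practice these come from your own product copy.
KEY_TERMS = ["instant access", "no fees", "guaranteed rate"]

def embolden(copy: str) -> str:
    for term in KEY_TERMS:
        copy = re.sub(re.escape(term),
                      lambda m: f"<strong>{m.group(0)}</strong>",
                      copy, flags=re.IGNORECASE)
    return copy

print(embolden("Enjoy Instant Access savings with no fees."))
# -> Enjoy <strong>Instant Access</strong> savings with <strong>no fees</strong>.
```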