
3 Myths of A/B Testing, Busted

Posted by Amy in Conversion Optimisation

Have you heard this line? “Got a low conversion rate? Run an A/B test and you will increase enquiries and sales by 2179%!” Too often, A/B testing is treated as the answer to a low conversion rate, and little thought goes into the process. While setting up and running a test isn’t difficult – there are plenty of user-friendly A/B testing tools out there – doing it right requires thought and care.

Myth 1: A/B tests will ALWAYS produce huge gains

Misconception

It’s no surprise many people think this is the case. There are so many case studies online where tests have increased sales by 400%. Just to name a couple:

  • abtests.com
  • whichtestwon.com

The truth is that only the high-performing tests get published; the ones that fail rarely, if ever, do. It’s just like Hollywood – you only hear about the actors who are rich and famous, while many other actors just aren’t in the spotlight.

The truth

“Only 1 out of 8 A/B tests have driven significant change.”

- Noah Kagan, AppSumo founder

Most A/B tests will fail – that is, there will be no lift in conversions compared to the original design. So it’s good to set the right expectations. If you expect every test to produce a winner, you will find yourself disappointed.

A/B testing is a learning experience


 “I have not failed 10,000 times. I have successfully found 10,000 ways that will not work”

- Thomas Alva Edison, Inventor of the Light Bulb

You will sometimes create an excellent hypothesis with the expectation of a 200% rise in conversions – only to find it performs worse than the original. Whether you are a professional at A/B testing or someone who is just getting started, this can be disheartening, so always think of A/B testing as a process of continuous improvement and learning from mistakes.

Conversion Optimisation is a process, not a one-off solution. Improving conversions is like developing your skills in anything you do. You will have to do it again and again, and it often takes many tries to gain insight into what works and what doesn’t, usually in incremental gains. To be realistic, expect a 10% gain here, a 1% loss there, a 5% loss here and then a 7% gain there – it all adds up to a big improvement.

Focus on the WHY

The purpose of A/B testing is not to get a continuous lift in conversions, but to learn something about your target audience and use those insights across your marketing efforts. For example, if a certain headline does not work in your tests, don’t use that headline again in anything you do (until you test it again and it becomes successful).

How to make the most of mistakes

When a test fails, here are some useful things you can do:

  • Evaluate the hypothesis – is it as strong as it could be? Could more data back up your reasoning?
  • User analysis – look at the heat maps to assess user behaviour on the site. What are visitors doing differently from what you expected?
  • Always be learning – here’s a case study on how ContentVerve turned a losing variation around by analysing exactly what wasn’t working.

Myth 2: A/B tests only need a few days

Did you know that, statistically, when you flip a coin you are twice as likely to get heads as tails? Don’t believe me? But it’s true – I flipped a coin 3 times, and 2 of the 3 times it landed on heads!
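To see how easily small samples mislead, here’s a quick simulation (a hypothetical sketch in Python, not part of the original post) of how often a perfectly fair coin looks “biased” towards heads:

```python
import random

random.seed(42)

def skewed_rate(flips, trials=100_000):
    """Fraction of experiments in which a fair coin lands heads
    at least two-thirds of the time - i.e. looks like a 'winner'."""
    skewed = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(flips))
        if heads / flips >= 2 / 3:
            skewed += 1
    return skewed / trials

for n in (3, 30, 300):
    print(f"{n:>4} flips: heads 'wins' {skewed_rate(n):.1%} of the time")
```

With 3 flips, heads “wins” about half the time; with 300 flips, almost never. That is exactly why a test needs a real sample size before you can trust the result.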

Running an A/B test without thinking about statistical confidence is worse than not running a test at all—it gives you false confidence that you know what works for your site, when the truth is that you don’t know any better than if you hadn’t run the test.

- Noah from 37signals

The 95% confidence level is a good baseline for drawing any conclusions. Once the confidence level hits 95%, it is very unlikely that the result is down to chance alone. If you are taking a long while to reach statistical confidence, look at your variations – are they different enough? The smaller the change, the longer it will take to reach a strong confidence level.
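If your testing tool doesn’t report confidence for you, a two-variation test can be checked with a standard two-proportion z-test. Here’s a minimal stdlib-only sketch (the function name and example numbers are mine, not from the post):

```python
from math import erf, sqrt

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B result.
    Returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF (Phi built from erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. 200 conversions from 10,000 visitors vs 250 from 10,000
z, p = ab_significance(200, 10_000, 250, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # significant at 95% if p < 0.05
```

A p-value below 0.05 corresponds to the 95% confidence threshold mentioned above; anything higher means the difference could easily be noise.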

Be patient

Don’t be discouraged by the sample sizes required – unless you have a very high-traffic website, it’s always going to take longer than you’d like. Better to be testing something slowly than testing nothing at all. Every day without an active test is a day wasted.
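As a rough guide to what “patient” means in practice, the standard two-proportion power calculation estimates the visitors needed per variation. This sketch assumes 95% confidence and 80% power; the function name and example numbers are illustrative, not from the original post:

```python
from math import ceil

def sample_size_per_variation(base_rate, relative_lift,
                              z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variation to detect a given
    relative lift at 95% confidence and 80% power."""
    delta = base_rate * relative_lift            # absolute difference sought
    variance = 2 * base_rate * (1 - base_rate)   # pooled variance, approx.
    return ceil((z_alpha + z_beta) ** 2 * variance / delta ** 2)

# e.g. a 2% baseline conversion rate, hoping to detect a 10% relative lift
print(sample_size_per_variation(0.02, 0.10))
```

Note how quickly the requirement grows as the expected lift shrinks: halving the detectable lift roughly quadruples the sample you need, which is why tiny tweaks take so long to reach confidence.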

Focus your experiments around understanding

If you have been doing any of these types of tests you need to STOP:

  • Different shade of grey for background
  • Different coloured buttons
  • Moving an item 5 pixels to the left

“Next time I see an article telling people to increase their conversion rate by using one color instead of another, I’m going to cry.”

NaomiNiles

Stop wasting your time testing minuscule changes that users will more than likely overlook. Colours can sometimes affect results, as certain colours make a call-to-action button stand out more strongly than others. But if you are testing red vs orange, ask yourself – WHY? Do you understand your audience well enough to believe that such a subtle colour change will affect your sales more than experimenting with an offer or a price?

Myth 3: I don’t need to test because others have done it already

Just because something works for a market dominator such as Amazon or eBay does not mean it will work for you. While examples are a good inspiration point to get the creative juices flowing, copying the Amazon homepage onto your own website will hurt you and waste a lot of money. The only time you could copy a competitor’s website is when:

  • You have the exact same target market group (visitor by visitor)
  • You sell the exact same product (and I mean EXACT)
  • You market the exact same way (same AdWords, SEO Keywords)
  • You have the same sales process
I could list a lot more, but from the list so far, I don’t believe any company can match a competitor on every point. Even IF you know Amazon spends millions on market research, it doesn’t mean their research will apply to your company. Never assume!

If you want to really improve your conversion rate, you need to do some heavy lifting – study! Spend more time on planning and research to find out:

  • Why people aren’t converting into sales
  • Where people are going instead
  • Why people trust your competitors’ websites more than yours
  • At which point you are losing your customers
and form your experiments and strategies around these factors instead.
If you would like to know more about A/B testing, drop me a comment below.

About

Amy is the head of Conversion Rate Optimisation and Design. Armed with an eye for analytics, she also has a background in graphic design, which complements the online marketing activities of E-Web Marketing perfectly – working closely with the SEO, PPC and Social departments. Amy is dedicated to helping businesses increase their online trust factor and brand credibility. She often holds both private and group workshops to teach business owners how to get more sales and enquiries from their websites. By breaking down websites in detail, giving demos of helpful tools and providing visual recommendations for changes, she has helped business owners achieve higher web conversion rates, more revenue and more engagement.
