Secret tricks for getting reliable results from your A/B testing tool

Conversion rate optimization (CRO) lets you squeeze the most out of your existing assets, and by assets I mean the website traffic you already have. A/B testing plays a crucial role in any CRO campaign. How? What actually happens when you run an A/B test on your website? Google it and you will find a ton of articles offering twice as much advice on A/B testing and A/B testing tools. However, you will hardly find an article that does not conflict with another one in the same search results, which only leaves you more confused about which A/B testing tool to use.

Here I am going to discuss some secret tricks that will help you run a successful A/B testing campaign. In fact, they are not so secret at all, but A/B testers often overlook them, so let's call them secret tricks for now.


Tricks to get reliable results from your A/B testing tool

1. Do not assume; make a hypothesis

If you have gone through articles on A/B testing, you might already be wondering: should I assume what my visitors would like? The big orange button worked for a popular site, so it will work for me as well, right? I am afraid that is not the secret trick; it might just worsen your conversion rates instead. Just because something worked for someone else does not mean it will work for you. That is an assumption, and assumptions are not good at all. There are more misguided assumptions that are common among marketers relying on A/B testing tools: for example, that yellow converts better, or that bigger yellow buttons will accelerate conversion rates. The secret trick is never to make such assumptions before conducting the test. The colors of CTA buttons and other elements should be chosen based on how well they stand out from the rest of the page, so yellow is not always the best choice. Instead, start your test with a hypothesis, but do not presume that the results will always be positive. Most of the time the results turn out different than expected.

2. Understand your visitors' needs first

The most effective way to know your visitors is to ask them directly. Asking your visitors questions helps you gauge the validity of your hypothesis and gives you a clearer picture of whom you have to serve. You can create online surveys to ask these questions; SurveyMonkey could be a good option in this case. Another option is to run heatmap tracking on your website, which records your visitors' activity on the site. Based on the statistics the heatmap tool shows, you can uncover the actual needs and requirements of your visitors. For example, if you are concerned about the CTA button on your landing page, you can use a heatmap tool alongside your A/B testing tool. The heatmap results will show you how many visitors actually click on that CTA and how many get distracted. In this way, you can create a variation for your A/B testing project with a more solid hypothesis. You can use the MockingFish tool here, as it provides a combo of an A/B testing tool and a heatmap tool free for one year.

3. Verify the confidence level using statistical significance

There are two things you need to verify after conducting an A/B test: statistical significance and statistical confidence. Statistical significance, in common terms, is the level of surety that the difference you see in your A/B testing results is real rather than random chance, so that the results are reliable enough for final implementation. Statistical confidence is the likelihood that the same result would repeat if you ran the test again. Both depend strictly on the factors below.

Sample size: If the sample size for your test is too small, statistical confidence and statistical significance can never be judged reliably. So make sure your A/B testing tool is fed a population large enough to produce confident results; in fact, make sure your website already receives enough traffic for the tool to work with. Conventionally, traffic is split 50-50 between the variation and the original page to keep the environment unbiased for both variants.

Test elements: To save time, you might think it wise to test multiple variables in a single A/B testing campaign, that is, to change a combination of elements on a page and test them together. In simple terms, that is multivariate testing, not A/B testing. Remember that A/B testing and multivariate testing are two different approaches with different considerations. So if you are running an A/B test, do not rush; test a single variable at a time.

Time duration: Time duration is a crucial factor in deciding the confidence level of your A/B testing results. Your A/B testing tool will not stop you from winding up the test early, but stopping early is not recommended. By stopping early you might fall prey to false positives, where one variation appears to be the winner but, after actual implementation, hurts your conversion rates instead. So it is very important to run the A/B testing experiment long enough that the tool has solid data from which to decide the confidence level.
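To make the idea of significance concrete, here is a minimal sketch of the kind of calculation an A/B testing tool performs under the hood, using a standard two-proportion z-test to compare the conversion rates of the original and the variation. The ab_test_significance helper and the visitor numbers are purely illustrative assumptions, not the output of any particular tool.

```python
from statistics import NormalDist

def ab_test_significance(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test comparing conversion rates of original (A) and variation (B).
    Illustrative sketch only; real tools may use different tests and corrections."""
    rate_a = conversions_a / visitors_a                    # conversion rate of the original
    rate_b = conversions_b / visitors_b                    # conversion rate of the variation
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)  # pooled rate under "no difference"
    std_err = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (rate_b - rate_a) / std_err                        # how many standard errors apart the rates are
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))           # two-tailed p-value
    confidence = (1 - p_value) * 100                       # "confidence level" as many tools report it
    return z, p_value, confidence

# Example: a 50-50 split of 10,000 visitors, 200 vs. 235 conversions (hypothetical numbers)
z, p, conf = ab_test_significance(5000, 200, 5000, 235)
print(f"z = {z:.2f}, p-value = {p:.4f}, confidence = {conf:.1f}%")
# Confidence below the usual 95% threshold means the test needs more traffic or more time.
```

The same calculation also shows why sample size matters: with the same conversion rates but far fewer visitors, the standard error grows and the reported confidence falls, which is exactly the small-sample problem described above.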

Finally, A/B testing is not a one-time process; most of the time you have to re-test the same variations to get a more solid result. Nor can you ever really stop: keep testing whenever you want to make a change on your website, or whenever you feel your website needs one. It will always help you make informed decisions that, at the very least, will not harm your current conversion rates.

