The five most common mistakes in testing

A/B and multivariant tests are important tools for evaluating how campaigns perform on a website. The results can be used to further improve setups and to take additional actions that optimize and personalize the user experience. But a few stumbling blocks lurk when using A/B testing. In today’s blog post, we’ll point out the five most common testing mistakes and how to avoid them.

Mistake 1: Starting to test without a plan

A common mistake: setting up tests just for the sake of testing. Instead, you should first and foremost make a plan.

The following questions help when setting up A/B and multivariant tests: Which target group do you want to reach with the test? Which part of the shop or which step in the customer journey should be improved? Which KPI should be optimized (e.g. bounce rate or conversion rate)? Where do you suspect the current problem lies, and what do you need to change? All these questions then lead to a meaningful hypothesis, which is the basis of a successful test setup.


Would you like to know what a sample hypothesis could look like?


Then download our free whitepaper on A/B and multivariant testing!

Mistake 2: Ignoring important key figures such as significance and confidence

Another mistake in testing: not understanding confidence and significance. Both are important indicators for correctly evaluating A/B and multivariant tests.

Confidence indicates how certain you can be that a measured difference reflects a real effect rather than chance. In practice, a confidence level of 95 percent is usually recommended.

Significance, on the other hand, indicates the point at which the difference between the two variants is no longer merely coincidental, so the result can be generalized to all of your visitors. This sounds complicated, and it is at first. A significance calculator helps, and one is already integrated into our platform, so you can be sure that your test is ready for a statistically valid evaluation.
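To make this less abstract, here is a minimal sketch of how such a significance check can be calculated for a simple two-variant test. It uses a standard two-proportion z-test; the visitor and conversion numbers are made up for illustration, and the exact method built into a testing platform may differ.

```python
from math import erf, sqrt

def confidence_of_ab_test(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test: returns the confidence (1 minus the two-sided p-value)
    that the difference between variant A and variant B is not just chance."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate under the assumption "no real difference"
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_error = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_error
    # Two-sided p-value from the normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return 1 - p_value

# Made-up example: 10,000 visitors per variant, 3.0% vs. 3.6% conversion rate
confidence = confidence_of_ab_test(10_000, 300, 10_000, 360)
print(f"Confidence that the uplift is real: {confidence:.1%}")  # about 98% for these numbers
# Only results at or above the recommended 95% level should be acted on.
```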


For a detailed explanation of significance and confidence, read our whitepaper on A/B and multivariant testing!

Mistake 3: Testing at the wrong time and ending tests too soon

Another common testing mistake: bad timing. If you choose the wrong time period and the wrong duration, the test results can quickly become distorted.

The right timing for testing is important: seasonal peaks such as Christmas lead to fluctuating traffic and purchasing behavior. It therefore makes sense to run tests in neutral months without special holidays or seasonal peaks. Otherwise, the tests may yield false positives, with a variant only appearing to win because of the season rather than because of the change itself.

The duration of an A/B or multivariant test is also crucial. The ideal duration differs from shop to shop, because how quickly a test reaches significance depends heavily on traffic. Test duration calculators help with a first estimate. With high traffic, test results can become significant very quickly. However, tests should never be ended too early: otherwise you run the risk of drawing entirely wrong conclusions, as the numbers might still have shifted over the remaining course of the test.
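As a rough illustration of what such a duration calculator does, the sketch below estimates the sample size per variant needed to detect a given uplift at 95 percent confidence and 80 percent power, and translates it into days based on daily traffic. The baseline conversion rate, the uplift, and the traffic figure are assumptions you would replace with your own numbers.

```python
from math import ceil

def estimated_test_duration(baseline_rate, relative_uplift, daily_visitors_per_variant,
                            z_confidence=1.96, z_power=0.84):
    """Rough sample-size estimate for a two-variant test.

    z_confidence = 1.96 corresponds to 95% confidence (two-sided),
    z_power = 0.84 corresponds to 80% statistical power.
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_uplift)
    # Standard approximation for comparing two conversion rates
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n_per_variant = (z_confidence + z_power) ** 2 * variance / (p2 - p1) ** 2
    days = ceil(n_per_variant / daily_visitors_per_variant)
    return ceil(n_per_variant), days

# Made-up example: 3% baseline conversion rate, we want to detect a 10% relative uplift,
# and about 2,000 visitors per day reach the tested page in each variant.
visitors, days = estimated_test_duration(0.03, 0.10, 2_000)
print(f"Roughly {visitors:,} visitors per variant, i.e. about {days} days.")
```

For low-traffic shops, the same back-of-the-envelope calculation quickly shows why a test sometimes has to run for several weeks before it can be evaluated with confidence.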

Mistake 4: Killing several birds with one stone

A/B tests are tempting for many marketers because you can quickly find out whether a setup works. Does that mean it’s best to test everything at once? No: too many tests running at the same time can influence each other and distort the results. However, if the campaigns being tested run on clearly separated pages, for example the homepage and a category page, A/B tests can also be set up in parallel. In addition, the user flow should be taken into account: if many visitors pass through both tested pages, the tests may still interact.
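One common way to let tests run in parallel without influencing each other is to split visitors into mutually exclusive groups, so that each visitor only ever takes part in one of the experiments. The sketch below illustrates the idea with a simple deterministic hash on a visitor ID; the function and experiment names are hypothetical and not tied to any particular platform.

```python
import hashlib

def assign_experiment(visitor_id: str, experiments: list[str]) -> str:
    """Deterministically assign a visitor to exactly one experiment,
    so parallel tests never overlap for the same person."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(experiments)
    return experiments[bucket]

# Hypothetical parallel tests on clearly separated pages
experiments = ["homepage_banner_test", "category_page_layout_test"]
print(assign_experiment("visitor-42", experiments))  # the same visitor always gets the same test
```

Because the assignment is based on a hash rather than a fresh random draw per page view, a visitor who moves from the homepage to a category page stays in the same experiment, which also covers the user-flow concern mentioned above.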

Mistake 5: Not testing at all and optimizing according to gut feeling

“When you have to consider so many things, it’s better not to test at all. After all, I know best what my users like.” Again, a big mistake, if not the biggest: gut feeling can be deceptive. Shop owners may know their business better than anyone else, but in our experience, tests often yield surprising results. And if you really want to know which actions are worthwhile, you can’t get around A/B and multivariant testing.

With a proper plan and the right tool, testing is no longer rocket science. With our whitepaper you are well prepared to get started: we show you how to avoid common pitfalls and how to set up proper tests. Additionally, we have collected plenty of inspiration from test setups run with our customers.


What are you waiting for? Download the whitepaper now and start testing successfully!

