The five most common mistakes in testing

A/B and multivariant tests are important tools for evaluating the performance of campaigns on a website. The results can be used to further improve setups and to identify additional actions for optimizing and personalizing the user experience. But a few stumbling blocks lurk when using A/B testing. In today’s blog post, we’ll point out the five most common testing mistakes and how to handle them.

Mistake 1: Starting to test without a plan

A common mistake: setting up tests just for the sake of testing. Instead, you should first and foremost make a plan.

The following questions help when setting up A/B and multivariant tests: Which target group do you want to reach with the test? Which parts of the shop or which step in the customer journey should be improved? Which KPI should be optimized (e.g. bounce rate or conversion rate)? Where do you suspect the current problem lies, and what do you need to change? All these questions lead to the formulation of a meaningful hypothesis, which is the basis of a successful test setup.


Would you like to know what a sample hypothesis could look like?


Then download our free whitepaper on A/B and multivariant testing!

Mistake 2: Ignoring important key figures such as significance and confidence

Another mistake in testing: not understanding confidence and significance. Both are important indicators for the correct evaluation of A/B and multivariant tests.

Confidence describes how certain you can be that the measured result is not due to chance. In practice, a confidence level of 95 percent is usually recommended.

Significance, on the other hand, indicates the point at which the difference between the two variants is no longer merely coincidental, so the result can be generalized to all users. This sounds complicated, and it is at first. A significance calculator helps – and is already integrated in our platform, so you can be sure that your test is ready for a statistically valid evaluation.
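To make this more tangible, here is a minimal Python sketch of how such a significance check works under the hood, assuming a standard two-proportion z-test on conversion rates; the visitor and conversion numbers are purely hypothetical and not taken from our platform.

```python
# Minimal sketch of a significance check for an A/B test, assuming a
# two-proportion z-test on conversion rates; all numbers below are made up.
from statistics import NormalDist

def ab_test_significance(visitors_a, conversions_a, visitors_b, conversions_b):
    """Return the z-score and two-sided p-value for two conversion rates."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    standard_error = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (rate_b - rate_a) / standard_error
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: variant B converts at 5.5% vs. 5.0% for the original
z, p = ab_test_significance(10_000, 500, 10_000, 550)
print(f"z = {z:.2f}, p = {p:.3f}")
# With a 95% confidence level, the result counts as significant if p < 0.05
print("significant" if p < 0.05 else "not yet significant")
```

In this made-up example, variant B looks better (5.5 percent vs. 5.0 percent), but the p-value is still above 0.05 – so at a 95 percent confidence level the test is not yet ready for a decision.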


For a detailed explanation of significance and confidence, read our whitepaper on A/B and multivariant testing!

Mistake 3: Testing at the wrong time and ending tests too soon

Another common testing mistake: bad timing. If you choose the wrong time period and the wrong duration, the test results can quickly become distorted.

The right timing for testing is important: seasonal effects such as the Christmas period can lead to fluctuating traffic and purchasing behavior. It therefore makes sense to run tests in neutral months without special holidays or seasonal peaks. Otherwise, the tests may yield false positives.

The duration of an A/B or multivariant test is also crucial. The ideal duration differs from shop to shop, as how quickly a test reaches significance depends heavily on traffic. Test duration calculators help with a first estimate. With high traffic, test results can become significant very quickly. However, tests should never be ended too early. Otherwise, you run the risk of drawing entirely wrong conclusions, as the numbers might still shift over the further course of the test.
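As a rough illustration of how a test duration calculator arrives at its estimate, here is a minimal Python sketch based on the common sample-size approximation for comparing two conversion rates; the baseline rate, expected uplift, and daily traffic are made-up assumptions.

```python
# Minimal sketch of a test-duration estimate, assuming the standard sample-size
# approximation for comparing two conversion rates; all inputs are made up.
from statistics import NormalDist

def required_visitors_per_variant(baseline_rate, expected_rate,
                                  confidence=0.95, power=0.80):
    """Approximate visitors needed per variant to detect the given uplift."""
    z_alpha = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # ~1.96 for 95%
    z_beta = NormalDist().inv_cdf(power)                       # ~0.84 for 80% power
    variance = (baseline_rate * (1 - baseline_rate)
                + expected_rate * (1 - expected_rate))
    effect = expected_rate - baseline_rate
    return ((z_alpha + z_beta) ** 2 * variance) / effect ** 2

# Hypothetical example: 3% baseline conversion rate, hoping for 3.5%,
# with 4,000 visitors per day split evenly across two variants
n = required_visitors_per_variant(0.03, 0.035)
days = (2 * n) / 4_000
print(f"~{n:,.0f} visitors per variant, roughly {days:.0f} days of testing")
```

With these assumptions, the sketch suggests roughly 20,000 visitors per variant, i.e. about ten days of testing; with less traffic or a smaller expected uplift, the required duration grows quickly.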

Mistake 4: Killing several birds with one stone

A/B tests are tempting for many marketers because you can quickly find out whether a setup works. Does that mean it’s best to test everything at once? No: too many tests running at the same time can influence each other and distort the result. However, if the campaigns being tested run on clearly separated pages, for example the homepage and a category page, A/B tests can also be set up in parallel. In addition, the user flow should be taken into account, so that visitors do not pass through several tests that influence each other.

Mistake 5: Not testing at all and optimizing according to gut feeling

“When you have to consider so many things, it’s better not to test at all. After all, I know best what my users like.” Again, a big mistake, if not the biggest. Gut feeling can be deceptive. Shop owners may know their business better than anyone else, but in our experience, tests often yield surprising results. And if you really want to know which actions are worthwhile, you can’t get around A/B and multivariant testing.

With a proper plan and the help of the right tool, testing is no longer witchcraft. With our whitepaper you are well-prepared to get started, as we’ll show you how to avoid common pitfalls and how to set up proper tests. Additionally, we have collected a lot of inspiration from different test setups with our customers.


What are you waiting for? Download the whitepaper now and start testing successfully!
