Winning from Losing

How to recover insights and inspiration after an A/B test fails

No one hits 100% of the shots they take. What matters is your ability to adjust your stance, learn the best way to move about the court, and ultimately improve your game moving forward.

The same is true in conversion rate optimization (CRO). Too often, teams fixate on their win rates and move quickly past a “losing” test without digging in to understand the “why” behind the result. Why didn’t customers respond to the change on the site? Why was our initial intuition off the mark?

Keep reading to understand how you can explore why an A/B test lost, so you can learn and iterate, ultimately improving your customer experience and driving up orders and revenue.


Capture supporting analytics data

A robust set of analytics data will let you know “what” happened. You may understand:

  • whether customers interacted with and signed up for an email list,
  • whether customers used the new module you added, and
  • most importantly, whether more customers purchased (or existing customers purchased more).

A well-designed A/B test will plan for the losing scenario. It will include enough supporting metrics to identify which customer behaviors changed at the point of the change, as well as actions in proximity to the change and throughout the rest of the customer journey.
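
To make this concrete, here is a minimal sketch of what that supporting instrumentation might look like in the browser. The "track" helper, event names, and function names are illustrative assumptions for this example, not any specific analytics tool's API.

```typescript
// A minimal sketch of supporting-event instrumentation for an A/B test.
// All names here are illustrative assumptions, not a specific tool's API.
type EventProps = Record<string, string | number | boolean>;

// Placeholder: in practice this would forward to your analytics library or tag manager.
function track(eventName: string, props: EventProps = {}): void {
  console.log("analytics event:", eventName, props);
}

// Record which variant the visitor saw, so every later event can be segmented by variant.
function recordExposure(experimentId: string, variant: "control" | "test"): void {
  track("experiment_exposure", { experimentId, variant });
}

// Behavior at the point of the change: did visitors interact with the new module?
function instrumentNewModule(moduleEl: HTMLElement, variant: string): void {
  moduleEl.addEventListener("click", () => track("module_click", { variant }));
}

// Behavior further down the journey: sign-ups and purchases, tagged with the same variant.
function recordSignup(variant: string): void {
  track("email_signup", { variant });
}

function recordPurchase(variant: string, orderValue: number): void {
  track("purchase", { variant, orderValue });
}
```

Because every event carries the variant, you can later compare behavior at the change, near the change, and at checkout for the control and test groups.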

From this “what,” you can begin to infer the “why.” When you see a decrease in orders and fewer visitors using the navigation after adding a carousel to the homepage, you may infer that the new carousel drove their attention to products that were less relevant. However, your understanding and storytelling about the A/B test can only go so far based on analytics and UX expertise alone.


Visualize customer behavior

For most tests we run at Experiment Zone, we set up click tracking and heatmaps using tools like Hotjar and CrazyEgg. With these lightweight tools, we can observe whether visitor behavior changed on the test variant compared to the control. We are also able to see whether or not visitors were scrolling far enough down to even see the changes that were made.
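
As a rough illustration of that scroll question, a lightweight in-house check could use the browser's IntersectionObserver to record whether the changed element ever enters the viewport. The element id, event name, and console logging below are assumptions for the sketch; heatmap tools like Hotjar and CrazyEgg capture this for you without custom code.

```typescript
// A minimal sketch: record whether visitors ever scroll the changed element into view.
// The element id, event payload, and console logging are illustrative assumptions.
function trackElementSeen(elementId: string, variant: "control" | "test"): void {
  const el = document.getElementById(elementId);
  if (!el) return;

  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          // In practice, forward this to your analytics tool instead of logging it.
          console.log("element seen:", { elementId, variant });
          observer.disconnect(); // only the first sighting matters
        }
      }
    },
    { threshold: 0.5 } // count it as "seen" once half the element is visible
  );

  observer.observe(el);
}

// Example: check whether visitors scroll far enough to see a new homepage carousel.
trackElementSeen("homepage-carousel", "test");
```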


Catch customers in context

Sometimes you’ll realize that there’s some additional context you need to understand the analytics or click tracking data you are capturing in a test. You may want to know whether visitors notice the new elements on the page or why they decide to leave a particular page.

You may consider doing a survey to ask them what they are doing and why. Why are you leaving checkout? What questions do you have about this product? You can target specific pages to ask questions by using an intercept survey tool like Hotjar or Ethnio.
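
If you are curious about the mechanics, here is a rough sketch of a home-grown exit-intent prompt. Intercept tools like Hotjar and Ethnio handle the page targeting and the survey UI for you, so treat the "/checkout" path check, the question wording, and the prompt dialog below as illustrative assumptions.

```typescript
// A minimal sketch of an exit-intent survey trigger on the checkout page.
// The "/checkout" path, question text, and prompt UI are illustrative assumptions;
// intercept survey tools provide proper targeting and survey widgets instead.
function armExitSurvey(): void {
  if (!window.location.pathname.startsWith("/checkout")) return;

  let asked = false;
  document.addEventListener("mouseleave", (event: MouseEvent) => {
    // A cursor leaving through the top of the viewport often precedes closing the tab.
    if (asked || event.clientY > 0) return;
    asked = true;

    const answer = window.prompt("Before you go: what stopped you from completing checkout?");
    if (answer) {
      console.log("exit survey response:", answer); // forward to your backend in practice
    }
  });
}

armExitSurvey();
```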


Run a usability study

Sometimes the only way to be sure what caused the outcome of an A/B test is to observe visitor behavior directly in a usability study. A usability study evaluates whether a product or page is easy to use and easy to understand.

To start, construct a scenario and tasks that reflect the expected user journey on the site. You can send 5 participants to the control variant and 5 participants to the test variant. Ask participants to think aloud and explain what they are noticing and what might be confusing. While participants are working, observe their body language, facial expressions, and emotions to identify what confuses them and where they hit pain points in each variant.
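
One practical detail, sketched below: recruited participants need to land in the right bucket. Most A/B testing tools offer a way to force a variant; this illustration assumes a hypothetical force_variant query parameter and a sessionStorage key rather than any specific tool's mechanism.

```typescript
// A minimal sketch of pinning recruited participants to a specific variant via a
// query parameter, so five testers see the control and five see the test variant.
// The "force_variant" parameter and the storage key are illustrative assumptions;
// most A/B testing tools expose an equivalent forcing mechanism of their own.
type Variant = "control" | "test";

function resolveVariant(defaultAssignment: () => Variant): Variant {
  const params = new URLSearchParams(window.location.search);
  const forced = params.get("force_variant");

  if (forced === "control" || forced === "test") {
    sessionStorage.setItem("forced_variant", forced); // keep the choice sticky across pages
    return forced;
  }

  const stored = sessionStorage.getItem("forced_variant");
  if (stored === "control" || stored === "test") return stored;

  return defaultAssignment(); // regular visitors get normal random bucketing
}

// Example: send participants a link ending in ?force_variant=test (hypothetical URL).
const variant = resolveVariant(() => (Math.random() < 0.5 ? "control" : "test"));
console.log("assigned variant:", variant);
```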

Want to know how we do this? Check out our series: Get More Bang for Your Buck with the Latest, Low-Cost Remote Research Tools.

The qualitative data from a survey or a usability study can answer why customers do what they do. By complementing your A/B test with qualitative research, you can triangulate the “what” and the “why” for a more complete understanding of the test results. Pairing quantitative data from A/B testing with qualitative data from UX research gives you a richer data set for better decision making.

As a bonus, you may get some additional inspiration from your customers on what to do or test next! So don’t be discouraged by a losing A/B test – instead, lean in and understand what happened and what you can do in future tests.
