Digging Deeper on Inconclusive Tests


*This article makes mention of Google Optimize. Google has officially announced that both Google Optimize and Optimize 360 will sunset on September 30, 2023.

If your usual response to inconclusive test results is to move on to the next experiment, you'll want to hear how AJ Davis and her team got a huge win after digging deeper into some flat results.



Quotes from the podcast

  • “We just couldn’t figure out why this didn’t work. We’ve done this type of experiment with other companies wanting to highlight Why this brand? and so often that’s really compelling for people.”

  • “It wasn’t the wrong idea. It was just the wrong execution.”

  • "It was really kind of nagging on us. Why did people not respond to this piece of content that is so often motivating in other contexts? So what we thought was, let's do some usability testing."

  • “They said it was vague. They basically were like, So what? … and we got this feedback consistently, no matter how we presented it, whether it was desktop or mobile.”

  • “It was a real challenge for [the client]. They had to go back to the drawing board and rethink their value proposition.”

  • “I think what optimization is all about is being open to challenging what a good idea is.”


Transcript

Brian:

What do you do when you run an experiment and it turns out inconclusive? You might've read that seven out of 10 A/B tests are inconclusive anyway, so you just move on with life. Well, today we're going to hear from AJ Davis about a time that she and her team dug in their heels on an inconclusive experiment, figured out what had happened, and managed to get a win out of it. AJ Davis is an industry expert in user experience strategy and the founder of Experiment Zone, which provides conversion strategy and testing for online businesses. Prior to starting Experiment Zone, AJ led optimization strategy for Fortune 500 companies during her tenure at Clearhead. She was also the lead UX researcher on the Google Optimize* product. But today she's here to tell us about a series of experiments with an IT software company. AJ, welcome to the show.


AJ:

Hey Brian, thanks for having me.


Brian:

Yeah, my pleasure. Let’s go right into this. Tell us a little bit about this IT software company and the product that you were working on.


AJ:

Sure. Yeah. This is an IT software company that sold annual subscriptions for software. They offered a really diverse, broad set of solutions, and they were targeting customers who were IT professionals managing network servers and database performance.


Brian:

And we're not going to name the company today, so we're just going to call them Acme IT. And the experiments that you ran, you're calling the "Why Acme" experiments. Can you go ahead and tell us what that was about?


AJ:

Sure thing. So for Acme IT, their goal was to get more people to do the demos on their website, as well as to sign up for free trials. What we did with this experiment was this: they had a block of content on their homepage that explained "Why Acme IT," and it highlighted the specific reasons they thought would make people feel compelled to buy from them or get that download from them. And we thought that having it on the homepage was great. But what we found in the data is that most people weren't seeing the homepage. Because people were landing directly on the product pages, we were missing the opportunity to communicate with those visitors, and we thought that showing this content on the specific software product pages would really reinforce that message. So we took the same text and the same design, and we put it on the product detail page.


Brian:

Makes sense. So what happened? How did it go?


AJ:

The test was inconclusive, and we were really surprised by this. We had expected that having a little bit of information about the business and the value proposition would be motivational for people. And we weren't really sure why it was inconclusive, why this wasn't moving the needle for people as they came to the site.


Brian:

Right. So these are people landing on this page who haven't already seen this content, which was present on the homepage. And we're talking about a sizable chunk of real estate on this page, and you added it and... nothing. Yeah.


AJ:

Nothing. And we knew people were reaching it because we had set up scroll tracking. So it wasn't a matter of not getting exposed to the element. It was visible. It was something that their team had worked on and thought was the appropriate representation of those key value props that would differentiate them from their competitors. Nothing. We just really couldn't figure out why this didn't work. We've done this type of experiment with other companies wanting to highlight why this brand, and so often that's really compelling for people. And the reason we think it's compelling is that the values a brand presents are things that humans connect with. As a consumer, you want to see the values that are important to you reflected in that business, and stating them plainly typically leads to more conversion.
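(The transcript doesn't say how the scroll tracking was implemented. As a minimal sketch of one common approach, the snippet below uses the browser's IntersectionObserver API to record an "exposure" event when the value-proposition block becomes visible; the `trackEvent` helper and the `#why-acme` selector are hypothetical placeholders for whatever analytics or testing tool is in use.)

```typescript
// Hypothetical sketch: fire an exposure event once the "Why Acme" block
// scrolls into view, so an inconclusive result can't be blamed on
// visitors never seeing the element.
function trackEvent(name: string, detail: Record<string, string>): void {
  // Replace with your analytics or testing tool's event API.
  console.log("event:", name, detail);
}

const whyBlock = document.querySelector("#why-acme");

if (whyBlock) {
  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        // Fire once when at least half of the block is visible, then stop observing.
        if (entry.isIntersecting) {
          trackEvent("why_block_viewed", { page: location.pathname });
          observer.disconnect();
        }
      }
    },
    { threshold: 0.5 }
  );
  observer.observe(whyBlock);
}
```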


Brian:

Sure. You almost wonder why anybody was converting without this context. But again, you threw it in, you tested it, inconclusive. It's the worst feeling in the world. A lot of effort goes into a test. You wait, you watch... flat line. So what did you do next?


AJ:

Well, we thought we had two options. One option was just to say, you know, it doesn't matter, and move on. And that happens sometimes, but it was really kind of nagging on us. Why was this really happening? Why did people not respond to this piece of content that is so often motivating in other contexts? So what we thought was, let's do some usability testing. Let's talk to real people, have them encounter this piece of information in the context we presented it in the test, and really understand and get their feedback. Did they stop and notice it? What were their observations about it? Did this play into their decision about whether or not they should be downloading the software? And so we decided to go with that second option and to understand what happened using usability testing.


Brian:

Okay. I think a lot of people who have an experimentation mindset, I’m just going to say, have maybe never done what you just described. So can you tell us a little bit about the setup or the approach or how did you do it?


AJ:

Sure. Yeah. We used usertesting.com to find participants who were IT professionals. The first thing you need to do is get the perspective of people who represent the audience that would truly be coming to the site. So we came up with a list of criteria that represented who our target group would be: what matters to them, what behaviors they have in their life, what kind of job they have, where they work. And we recruited five participants on desktop and five participants on mobile. Before we did the recruitment, we put together a test plan, which outlined a specific script and a specific order of tasks that we would have them go through, which we felt would represent a typical flow where you would be landing on this product page and naturally interacting with it. And then in the second half of the study, we would give them focused, targeted tasks to go look at specific elements, namely this "Why Us" section.
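(AJ describes the study setup in prose. Purely as an illustration, here is one way such a plan might be captured as structured data before recruiting; the screener criteria, device split, and task wording are made up for the sketch and are not the actual study.)

```typescript
// Hypothetical outline of an unmoderated usability test plan like the one described above.
interface TestPlan {
  screener: string[];                    // who qualifies as a participant
  participantsPerDevice: number;         // sample size per device type
  devices: ("desktop" | "mobile")[];
  tasks: string[];                       // shown one at a time, in order
}

const whyUsStudy: TestPlan = {
  screener: [
    "Works as an IT professional",
    "Manages network servers or database performance",
  ],
  participantsPerDevice: 5,
  devices: ["desktop", "mobile"],
  tasks: [
    // Open-ended flow first, so first impressions aren't biased.
    "You land on this product page. Explore it as you normally would and think aloud.",
    // Then focused tasks targeting the element under study.
    "Find the section that explains why you would buy from this company. What does it tell you?",
    "Based on that section, how is this company different from other IT software vendors?",
  ],
};
```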


Brian:

So you found an actual qualified audience and not just some yahoo off the street. And you said you gave them a script. Okay.


AJ:

Yeah. With usertesting.com and any of these tools that are out there, you set up a specific task and a starting page URL. The participants are sent to that specific page, and then they're presented with a little popup window that prompts them to do something. They are asked at the beginning and reminded throughout to think aloud and share what they're thinking and what they're experiencing. So it's a unique opportunity to understand how people are really responding and how they're feeling about things as they're encountering them for the first time. After each of the tasks they say they're done, and they move on to the next one. And so it guides them through all the tasks without them seeing all the specific places that the script will head. You know, the headline is that the "Why Us" wasn't the wrong idea.


AJ:

It was just the wrong execution. What we found was that all of the participants who were asked about the "Why Us" information couldn't distinguish from that information why they would buy from this IT company versus other IT software companies. They said it was vague. They basically were like, "So what?" And we got this feedback consistently, no matter how we presented it, whether it was desktop or mobile. And we decided that we needed to really come back to the drawing board and understand the specifics of the words that were being used, how we presented them, and what we presented as the "Why Acme IT."


Brian:

So you ran the study and you got back some kind of data and feedback from these user testers. What did that look like? And what did it tell you?


AJ:

Yeah, we got overwhelming feedback that the "Why Us" information just wasn't distinguishing them from other IT software companies. They saw it, they understood it, they could understand the intent behind it, but basically all of the participants were like, "So what? How does this make this company different than other IT software companies?"


Brian:

And I guess this is a technical topic and these are technical users, so this wasn't obvious, of course, to anybody. But you got this overwhelming feedback from the user testing. What did you do with that?


AJ:

Yeah, it was really surprising. You know, I think that's one of the benefits of doing user testing: you often hear things that you just aren't expecting, because you get into a certain mindset or a certain perspective about things. And that certainly was true for our team, and it was true for our client. So what we did was take the video clips back to them and say, Hey, here's what happened. And they were equally surprised by this. We had been looking at this "Why Us" part of the page and trying to figure out the placement, how do we make people see it, worrying about the data behind it or worrying about the experience behind it, but never questioning: are we saying the right words? We knew we wanted to communicate a certain type of thing and never stopped to question whether we were even effectively doing that. And by presenting those videos and having those conversations, it was a real challenge for the business. They had to go back to the drawing board and rethink their value proposition. And that's a challenge for a business that's been around for a long time: to think about how do we talk about this, not just on the product page, not just on our homepage, but for the business.


Brian:

Sure. Across advertising channels or in sales conversations. Yeah. It sounds like they thought, we're going to test a widget on a product page, and it came back several levels deeper, at a fundamental level. So what happened? How did this play out?


AJ:

Yeah, obviously it takes some time to rethink your value propositions and have lots of conversations about it, but, you know, they did the hard work and they challenged themselves to do that. And we did arrive at a more specific solution, which we tested with the same setup, only changing the text that we included in it. And it helped conversion.


Brian:

Nice. Okay. So a lot of work and a happy ending. It sounds like this was probably an emotional experience for the client, for the company. How did this go over after it was all said and done, and the second experiment had run and proved a winner?


AJ:

I think it caused a lot of reflection. There certainly was a bit of chaos in there too, of like, Oh, well, why didn't this test win? There was a lot of stopping and challenging our own assumptions, both for us and for the client. We assumed that telling people why this brand was great would work, and then it didn't. Okay, well then what do we do? Okay, well, we need to know why it wasn't great. Oh, great, it's because it sounds the same as everyone else. Oh, that's heavy. And then finally coming full circle and finding something that actually communicates why this brand is different and why this company can better serve IT professionals than other software companies. We kept moving it forward so that we would ultimately be able to learn something that could help their business, as opposed to just kind of struggling and moving on.


Brian:

I've been guilty of just moving on, but this story's definitely going to make me rethink that in the future. I'm curious about you: did this experiment change the way you approach optimization in general?


AJ:

Yeah. I think what optimization is all about is being open to challenging what a good idea is. Right? We work really hard to find the solution, and we have a goal in mind, and we're like, here's a solution, let's test it, it's definitely going to win. And then it doesn't. And so then we start building this acceptance that we're not always right, no matter the best intentions, the best research, the best thought-out process ahead of time. And so for me, it's about asking a little bit more and challenging a little bit more before a test goes out, and then not letting it go if we're not clear on what happened. So if a test wins or loses or it's inconclusive, being able to really clearly see why that happened. And if it's not evident in the quantitative data, taking that extra step to explore qualitative data. I point to this example and some other examples in making the case for it, because I think ultimately that's where the challenge lies: sometimes clients aren't really hungry for answering the why, they just want to improve conversion rates. And so sometimes it's about knowing you have to help grow that relationship and trust to the point where they're also sitting alongside you, wondering why, and being curious enough and valuing the answer enough to continue down the path until you have the answer.


Brian:

You know, while I've got you here, I'd love to talk a little bit more about the topic of user testing and user research in general, since you have so much experience with it. You mentioned the challenge sometimes of, I'll say, selling the idea to a client or to stakeholders internally. Is it worth it? Is it worth the time and the expense? Imagining that you had an unlimited budget to do user research, say, on this project we just talked about, what would you do with that?


AJ:

I've actually been in the place of having an unlimited budget for research. It's really fun. When I was an in-house researcher at Google, that's the only time I've truly been in a place where you have nearly unlimited resources to support doing user research. I think in general, what happens with user research is that you have to come into it with a specific learning in mind. So if you're coming from a testing mindset and you say, we did this A/B test, we learned something, let's do some research and study it, great, that's user research in response to a specific test. A better approach would be to have sandwiched research. For us, that would have meant doing the research where we did it, and then, once we had a new value proposition, repeating the research and getting feedback on that, because we may have even found a better solution to put out there for the company.


AJ:

And so that would be my second version of better research within CRO. And then I think the ultimate way of doing user research would be to start every engagement with a usability study and then have regular check-ins on that same study, like a benchmark study, so that you are always in tune with what the biggest problems are on the site, how you can better communicate to the customers, and how you can improve their customer experience. It's not usually feasible to do that from a budget, timing, and priorities perspective, but having that opportunity over time to measure the usability moments, using that same methodology, really gives you a consistent lens into where the best opportunities are.


Brian:

That's a great answer. Yeah. So you've outlined kind of a spectrum, where at one end there's zero user research being done at all. It's just experiments, and if something is inconclusive, move on to the next one. And then there's this utopian vision where you have unlimited resources, you start with user research, and you do it on an ongoing basis. I wonder if you could imagine somebody who is running an A/B-testing-driven optimization practice, who doesn't really have experience with the kind of research methods you're talking about today, but they want to get into it. They want to move toward that utopian end of the spectrum. Do you have a sense of the easiest way to start? What's something someone can go do right away without a ton of budget or buy-in?


AJ:

Hmm. There are a lot of methods in the user researcher tool belt. We've only been talking about usability research today, which is one of the heavier-lifting methods in a sense, because you have videos to watch and individual people to recruit and compensate for their time. There's certainly even more that you can do beyond that that's more expensive. But within usability studies, the leanest approaches you can take are often unmoderated remote studies. Tools like trymyui.com or usertesting.com are good for keeping research costs and recruitment effort really minimal. There are other tools, though. So if a usability study feels scary, or is expensive, or doesn't fit into your timeline, there are really great options. One of my favorite things to do is to run the five-second test, at fivesecondtest.com. It's just a chance to understand people's first impressions.


AJ:

That's a really good way of just getting started with a client, to make sure to check the box that when people come to the website, they understand what it's about. And a lot of times they don't, or they don't consistently understand it. That's a good starting place for helping the business, and you can show them the outcome of the study and be like, Hey, look, all these people came to your website, and here's the huge variety of things they thought it might be about. In between those two extremes there are things like asking survey questions in context using a tool like Hotjar. You could also do customer interviews, which also tend to be a little less expensive. And then my favorite is guerrilla research: the idea of just getting out there in the wild and starting to ask questions of people who are proxy users. So for example, talking to people who work in the physical store about how they are selling things, or people who answer customer service lines, instead of talking to the customer herself.


Brian:

Perfect. Thanks. Okay. You just gave us a bunch of great ideas, techniques, and also resources, which I will link out in the show notes for everybody who’s interested. Speaking of which, where can people find you online?


AJ:

So you can find me at experimentzone.com. That’s my company’s website. I’m also on LinkedIn, AJ Davis. I love getting random requests for coffee.


Brian:

Perfect. Okay. AJ, thank you so much for your time today.


AJ:

Thanks so much for having me, Brian.
