AJ Davis, founder and CEO of US-based Experiment Zone, interviewed on The Big Lift, the podcast of Webtrends Optimize. Webtrends Optimize is the CRO solution that enables marketers and developers to maximize the ROI on their digital properties.
In today’s podcast, we discussed how A/B testing/CRO has advanced, talked about how the market has matured, and whether the balance between data-driven decision-making and visitor empathy is right.
Experiment Zone helps companies identify opportunities and evaluate ideas using data so that they can improve customer experiences and increase sales. Schedule your free discovery call today to get fast, expert advice focused on improving your conversion rate.
John Fleming 0:04
You’re listening to The Big Lift, the podcast of Webtrends Optimize - the CRO solution that enables marketers and developers to maximize the ROI on their digital properties. Webtrends Optimize is a powerful, feature-rich, and easy-to-use solution, all delivered within a fixed-price contract with no additional cost for increased functionality. Ever. During these podcasts, we meet some of the key influencers within the marketing and conversion world to understand their roles and examine their challenges.
Today, I’m in conversation with AJ Davis, founder and CEO of US-based Experiment Zone. As part of the team, AJ worked at Google during the conception and development of Google Optimize, so we discuss how A/B testing/CRO has advanced since then, we talk about how the market has matured, whether the balance between data-driven decision-making and visitor empathy is right, and whether businesses are attracting the right customers.
Hey, AJ, I noticed from your LinkedIn profile that you were a User Experience Researcher at Google before they launched Google Optimize - it was around 2015 when it went to beta or something like that. What was it like during those early days? How did your own experience help drive the product forward?
AJ Davis 1:17
Yeah, I had a fantastic opportunity to be part of the Google Optimize team as their lead researcher and user experience researcher. From day one - we were in a hotel lobby brainstorming ideas and user journeys and doing the full Google sprint week together. It was just in that ideation: what are those core problems we’re going to solve? What are some of the pain points that customers have, and all that fun discovery? Then I was part of all the research that went hand in hand with the product from that day until the product launch. That was a really fun journey to be on, lots of learning and hypothesis testing.
John Fleming 1:58
That must have been really exciting as well.
AJ Davis 2:01
I really enjoyed it. I love innovation, I really love iteration. A/B testing really speaks to that, as does building a brand new product.
John Fleming 2:11
So did you yourself learn anything that you took forward to your future career?
AJ Davis 2:15
I think I basically learned that I was jealous of the people who got to do optimization. At that point in my career, I was a qualitative user experience researcher. What that means is I was running a lot of user testing, interviews, and usability studies, really getting to the nitty-gritty of how people use products, how they move through their lives, and what that intersection of technology and human looks like. It was very fun. It’s very energizing. But I had come from an economics degree and really loved hard data and being able to make decisions at that intersection.
With the research on Google Optimize, I was interviewing people who got to look at user experiences and then have a quantitative approach to testing out their hypotheses for whether those changes would really work or not in the real world.
I always felt, as a user experience researcher, this lingering question of: yes, we saw it in the lab, yes, we learned about this behavior, but does it hold up in the real world? Optimization and A/B testing really pass that bar of saying, we’ve learned about it, we’ve thought about it, we’ve really researched it upfront, and now let’s validate it in the real world as well. I ended up transitioning careers after working on the product and becoming an optimization director.
John Fleming 3:39
Do you think the data is the whole truth or just part of the truth? Because I think there are two camps out there: one set of people believe it’s only part of it, and other people say no, the data never lies.
AJ Davis 3:50
I think that data can be an important part of understanding the story of what’s happening. But there are lots of different data that could point you in the wrong direction. It’s about crafting a hypothesis that’s informed, then knowing which data points would confirm or deny that hypothesis.
For example, if we only looked at a single metric, we might be misled. But if we could understand in aggregate all the changes that could happen as a result of the change, then we would feel much more confident in the signals we’re getting. I believe data is really powerful. I think it’s more important to think about what data or what questions to start with before you look at the data so that you don’t get misled.
John Fleming 4:36
That’s a really great answer, because that’s what I’d hoped you’d say. Other people just believe that if it’s in the data, it must be true, but I know that not to be the case. When you were working at Google, you were very much on the ground floor with regard to what Google was doing. But there were other products out there at that point as well, like Optimizely and Webtrends, built around optimization.
Do you think that the market was ripe at that time to bring in, say, Google with their free tier and the lots of other vendors that have joined since? Or do you think it’s still in its early infancy now?
AJ Davis 5:10
I certainly think that the industry is growing in recognition. It’s no longer this niche position somewhere in an analytics team; a lot of core business decisions are starting to look at experimentation programs to really inform higher-level strategy. I do think that we are pretty early on, though. What I look for is, in addition to product maturity in the market - the variety of choices you have and the variety of features you may have - I also look at what our clients are doing, where the titles are, and how experienced people are.
You see that companies want these skills and that they need these skills, but there’s not a corresponding degree program. It’s not clear who you hire, because we really are unicorns: you have to know data, you have to know user experience, you have to know development, you have to know analytics. There’s just a myriad of skills that have to come together around experimentation, and there’s not something that’s necessarily training people to cover all of those things.
The other thing is, my master’s degree is in Human Factors and Information Design, which is a really fancy way of saying User Experience Research. That degree was growing in popularity when I was getting it; now it’s widely recognized. I think we’re going to be in a similar place in just a few years, where everyone knows about experimentation, and there’ll be a more standard path into it.
John Fleming 6:43
It’s gained popularity, and I think there are a lot of people talking about it now. But outside of, I’d say, the top 1,000 brands, the next tier and the tier below that aren’t quite getting to grips with it yet. I think there are some challenges around that. Do you think that there are enough experienced people out there to help companies beyond those top 1,000?
AJ Davis 7:08
I think the field is expanding, but I do think you have to be very selective about who you work with. There are some trends that emerge where people end up with sort of a list of tests, and they just run the same tests with all their clients, versus a more strategic approach to understanding your audience, your business needs, and your goals. That’s often something that surprises people when I’m first introducing them to conversion rate optimization and how they could partner with us: how personalized and thoughtful an approach we take.
There’s a ton of work that happens before we even think about a specific test. There’s also a lot of expertise in how to set up the test and what metrics you need to collect. I think we do have a shortage of people who are really talented in this area, but there’s an appreciation for it. There’s definitely a growing set of resources, and I do see smaller companies able to start hiring, even if they’re outsourcing for that talent, to be able to benefit from CRO.
John Fleming 8:20
Yeah, looking at the UK, there are many hundreds of job roles being advertised for CRO-type or user-experience-type expertise. I still feel there’s a challenge in the marketplace: there aren’t enough people to fill those vacancies. Therefore, the agency approach - one agency working with multiple organizations - is probably still prevalent at this point. Whilst I think in the not-too-distant future, once the expertise gains more momentum, we’ll probably see somewhere in the region of a 50:50 split between in-house and outsourced expertise. But that’s just my own feeling. What’s it like in the States?
AJ Davis 8:59
I think you’re onto something that’s generally true: as you’re able to scale, you can start bringing some of the expertise in-house. One of the real challenges we see in the States is the realization that it’s a very hard hire to bring one person in, versus an agency that already has the process. They have years and hundreds of tests under their belt for understanding what works and what people respond to. Plus, the expense of hiring developers, analysts, project managers, designers, and researchers to start a CRO team is far greater than hiring an agency that already has all that skill in-house.
John Fleming 9:42
I’m going to take you back to when you started Experiment Zone, back in March 2017, because the landscape must have been quite different at that point - there was very little expertise around. Google was announcing their free-tier product, which must have been, I suppose, generating some hype that A/B testing, as it was really known then, was going to take off at a greatly accelerated rate.
Now, this takes me back to something earlier in my career: we had the same thing with desktop publishing many, many years ago in the 80s, where everybody had to become an expert in desktop publishing. It never really took off in-house; it was still the agencies that looked after it. What was it that stimulated you to go off and do your own thing?
AJ Davis 10:35
That’s a great question. One hypothesis I had as a researcher at Google - which has since changed, or at least I’ve learned and gathered some signal otherwise - is the idea that A/B testing should be for everyone. The idea of a WYSIWYG editor where people can drag and drop features, make really smart test ideas, and have easy-to-read reports. I still believe that’s true.
I’ve worked with some major companies, and also smaller companies that are quickly growing and trying to take advantage of an increase in traffic. It’s a really hard puzzle to solve - to figure out what to do with it and to have that dedicated focus. I think one of the things that motivated me to have my own agency is this intersection of research and A/B testing.
I’ve worked in both agencies that strictly do A/B testing and agencies that strictly do user experience research. They both have their own merits, but I love the intersection of the two. I love bringing them to companies who haven’t thought to put those pieces together. The reason that intersection is so important to me is that I have this experience working in the product development world, where you have the opportunity to learn, iterate, and incorporate data that are both qualitative and quantitative - which I think is really hard for marketing teams to do.
For me, it’s just a really fun intersection. I love working with teams that have maybe tested a little bit or have tried some things out, but are really looking to level it up and learn not just about what works or not for their visitors becoming customers, but also things that can apply more broadly to their business strategy.
John Fleming 12:24
I think there’s a lot to do with UX and design and flow and things like that, which can only really be measured by testing different variants. I think that’s an area where there’s a hidden benefit with regard to CRO: not only can it deliver value and an increase in return on investment, but it can also help you improve your site by reducing friction - that’s one way of describing it, I suppose - and therefore improve the overall customer experience. And that can feed back into developing other areas of your site as well.
AJ Davis 12:57
One thing I would add to that is, what keeps me doing this? This is a different spin on what you asked, but what keeps me doing this is that I get to be wrong, and I get to learn.
John Fleming 13:09
Nobody would admit to wanting to be wrong.
AJ Davis 13:12
But it’s so much fun. If you just get it right every single time, then why are you testing? It’s really great to build up a hypothesis, measure it in a really smart way, and then learn something really amazing, or learn about a really big opportunity for the business and then build on it further. Over the years, it can be that very small changes have these huge effects one way or the other.
Often, I see businesses just changing things free-form, or because someone had a good idea, or because a competitor is doing it. I have little moments where I have to catch myself and not tell them to stop, because we’ve seen, in measuring all those little things, that you just don’t know sometimes what could happen. Oftentimes, things do align with our expectations, and we can still learn about customers. But in those odd cases where something unexpected happens, you can have a breakthrough moment that changes how you think about the customer, how you think about the site architecture, and a whole bunch of other things.
John Fleming 14:20
I think one of the challenges is that many companies do a test and it proves to be successful, then they don’t look at it again for another three to six months. They think they’ve done it, where actually I believe - and many others believe - that it’s more iterative: you do a test, you try it, then you keep revisiting that test to make sure it’s still optimized for the conditions, which might change over that time. Do you believe the same?
AJ Davis 14:45
To build on that, I would say that even taking the same learning and applying it elsewhere on the site is an often-missed opportunity. If you see that reminding your customer about free shipping works on the product page, maybe they need to be reminded elsewhere. Maybe that’s an effective message to incorporate elsewhere.
For us, we like to think about not only doing a test with a specific hypothesis, but also building up what we call test themes, where we identify an underlying principle that we’re trying to understand and learn about. Instead of just pulling from random ideas across the year, we focus really deeply on test themes with our clients. Then we can walk away and say, we’ve really learned this, and we know we’re maximizing the value of this type of theme.
John Fleming 15:32
Do you know when to walk away?
AJ Davis 15:35
Great question. The way that we think about it is that the roadmap is iterative, just like any individual test is. We use qualitative research to help refocus us. It’s not so much that we’re walking away; it’s just that we may reprioritize. Take a usability test: you may surface three to five top-priority problems that your customers are having on the site. You will want to address those with more urgency, because the customer knows it’s a problem, they can articulate the problem, and you see them have the problem as they’re trying to achieve their goals on the site. It becomes more urgent to solve those issues. But certainly, we do revisit those test themes again and again as we work with our clients over the years; some things just ebb and flow in terms of their urgency.
John Fleming 16:29
Looking back at March 2017, it must have been quite challenging to get your first customers on board, because obviously CRO adoption was very much in its infancy at that point. How did you get your first clients on board?
AJ Davis 16:46
Amazingly, my first client was my previous company. I had been a Senior Optimization Director at a company here in Austin called Clearhead, working with some top brands across the United States. I decided I wanted to start my own agency. I talked to them and said, look, I would love to keep working together, but I’m gonna go do my own thing. “Would you like to work together in this capacity?” It wasn’t a snap of a finger - there was certainly some great conversation about it - but I did start day one with a full-time client.
I also explored some other services, did a lot of usability research with some small and medium companies here in Austin, and kind of worked through what it was that we were going to offer and how we would operate in this space. It was definitely learning by trial. We really carry the same philosophy in our name, Experiment Zone: we really believe in the power of experimentation. We do that for ourselves and our clients. We’re always looking for ways to give ourselves a hypothesis, test against it, and be comfortable with failure, because that’s where we can learn and really find where we can add value and have a fun time in our space.
John Fleming 18:07
Do you believe that your clients are focused on finding the right customers? Or are they generally just happy to gain customers, whoever they are?
AJ Davis 18:14
I’d reframe it a little differently. I think that they’re eager to learn about who their visitors are and what attributes their current customers have. If there’s enough similarity, they’re pretty happy with that. But they want to be able to better connect with the customers that already exist, the ones that are there. I don’t often see customers say, I want to go chase down a whole new vertical and a whole new customer set.
I think a lot of our customers are fairly mature in their understanding, having identified some personas and ideas of what their ideal customer would look like. Where I find our ideal clients are businesses that aren’t strictly checking a list of the demographic traits of their customer, or some of the more superficial attributes of customers. They really want to understand: what are their pain points on the site? What are their pain points in life? What are some of these bigger friction points? Then they can deliver a better product or a better service, so that in the long term they’ve built up a stronger lifetime value for that customer, and they understand them and can maintain that relationship.
I think when it comes to optimizing a site, you’re looking to nudge people who might be on the fence, or nudge people toward the right information that they’re having trouble finding. It may be on the margins that you’re making the change today, but you can build up a broader understanding of what kind of stimuli your customers respond to. Then, when paired with qualitative research, you can build up a really strong understanding of what they need.
John Fleming 19:59
Do you think you, or your clients, might be in danger of using that kind of hackneyed “Amazon recommends” approach - customers who looked at this also looked at that? Do you think that’s an attitude customers are taking now, trying to cookie-cutter a process? Or do you think it’s much more personalized than that now?
AJ Davis 20:21
I think it’s a balancing act. And it’s a tough one. Because in one way, we want cookie cutter for some UX elements. We want your navigation to be recognizable, we want it to be findable. We want your search to work as people expect search to work. In some ways, we do want a very cookie-cutter interaction so that people don’t get confused by it.
On the flip side, we can over-index on what our competitors are doing or what industry norms are. Personalization has been such a buzzword; if done correctly, it can add value. But a lot of companies forget to measure the impact of adding it - whether it’s good or bad, increasing trust or not.
I think what’s more important is to start with the problems your customers or your visitors have, rather than just looking around the field and saying, what latest technology can we add to our site? Instead, try to solve the pain points they have, because that’s a way to build trust and loyalty. It’s a way to remove friction much sooner for that customer so that they can become a repeat customer. The answer may or may not be some sort of recommendation engine that everyone else has, but it should be use-case-dependent. I think businesses should take a step back and ask: why am I considering this solution?
John Fleming 21:54
How would you advise your clients to focus on the right set of customers, then? Is it just looking at the data?
AJ Davis 22:00
Good question. I think that we often start with a hypothesis of the customer. More often our clients have customers, and we can talk to those customers and help them understand or build that persona if they don’t have it already. Then within that, depending on what the research question is, you do need to build in a sense of who you’re trying to learn about. It could be we’re looking for mobile visitors who’ve purchased on our site before, and we want to get their experience and get their feedback on how that’s gone. It could be that we’re moving into an adjacent product line, and we want to talk to people who’ve purchased similar products, but from a competitor.
We can understand more discovery-type information from that audience. I think it’s that difficult consulting answer: it depends - and it should depend. You don’t want to always be asking the same people, or, you know, blindly talking to a specific audience over and over. You do want to test the waters, but you don’t want to leave your core customers behind either.
John Fleming 23:11
Yeah, so it is a bit of a challenge to make sure you’re keeping the plates spinning - it’s a phrase we use over here - and that you don’t drop any of them. Once you’ve found out who your customers are, you then need to align CRO on top of that. Are there any metrics you use to understand who your customers are and what behaviors they exhibit? Or is it something you just let flow, with a whole toolbox of different metrics that come into play at different times?
AJ Davis 23:42
There are some typical customer journeys by industry, so we have some predictability about what kind of metrics we would want to use for those types of use cases. Certainly, we could talk about that. But I think a more holistic approach is to borrow from the product world and say: here’s my hypothesis for who my user is, here are their problems and their goals. Then break those down into specific tasks, and map those tasks out into a customer journey. As you mature through that lifecycle of developing your website or your experience - as its own product, or as its own experience - you get to a place where you have a more expected path that customers will follow.
At each point in that path, there’ll be a page to see or a CTA to click on. Those become the customer journey metrics that we keep track of. We can understand where people enter the site, how they generally move about the site when they’re browsing or learning, and then what they do when they ultimately find the product, service, or solution - to then submit, or purchase, or, you know, share their email.
John Fleming 24:58
That’s a good one, because you used the word empathy there, and I’ve got a question here that I was going to ask anyway. You run a series of webinars, “Growth through Empathy”. Now, I think there’s a very subtle difference between empathy and understanding. I feel that lots and lots of brands try to understand their customers, but very few brands have empathy with their customers. Do you see a difference? And how do you get empathy into your brand’s awareness?
AJ Davis 25:29
To me, the distinction between empathy and understanding is that you can understand someone, or understand their motives, or how they make decisions. But empathy requires seeing things from “being in their shoes” - or I love to think of it as putting on that person’s hat. Trying to understand what their environment is like, or what might be going on, to give them the benefit of the doubt, is a really important part of that, versus mere understanding.
Empathy, to me, is the ability to remove yourself from the understanding and really focus on what’s motivating the customer, what struggles they have - both with your products and your experience, but even more broadly: why are they even on your site in the first place? Something I see really often in new companies we work with is not being able to directly state, from a customer perspective, the message they’re looking for. You may sell really great planners, and the customer knows they’re looking for a planner, and that’s great.
A better way to show empathy and to recognize their situation is to recognize the chaos that’s in their world, or what their desires are. They’re looking for something concrete, focused, and tangible in a world that’s very transient and chaotic. Starting to move into a place where you can understand what makes somebody tick, and what struggles they have, gives you a deeper layer of empathy and understanding of those people - and of how you can connect with them and create meaning through the products and services that you sell.
As soon as you can unlock that, and you know exactly what problem you’re solving and who has that problem, you can go so much further as a business, and you’re also having a real impact on that audience and that group. Then other people can just leave it behind - you’re being empathetic to people who can’t be your customer, because you’re so direct in saying: this is my customer, this is their problem. If you don’t have that problem, you’re saving that other person time researching and figuring out whether their problem can be solved with this particular solution.
John Fleming 27:58
But do you think empathy is directly opposed to data? Because one’s a very hard science, and the other is a very soft science.
AJ Davis 28:09
I think it’s perfect. That’s exactly our intersection: being able to bridge that. In so many organizations, being empathetic is a separate practice from understanding the data. But if you can apply a lens of empathy to data, or if you bring data to your understanding of your customers’ pain points, you’re just getting stronger and stronger signals and a better understanding of what’s happening for your customers and what their behaviors are. For example, if you saw a lot of drop-off on a certain page, and you brought a more empathetic lens to it, you might ask: what struggle is that customer having? Why are they leaving? What did we miss here? Versus: we need to decrease the bounce rate and increase clicks here. Not seeing, through the customer’s lens, why those things may be happening in the data shortcuts your ability to find a solution.
John Fleming 29:08
Do you believe that CRO can demonstrate empathy?
AJ Davis 29:11
I think really good CRO must include empathy. I think all business needs to include empathy, if I’m being honest. We build relationships - humankind is all about building connections, growing relationships, and being a community, and businesses function as part of that. If we’re doing business without understanding or empathy or curiosity about our customers, it’s not going to be a long-lasting business. To hyper-focus on CRO: we are the frontline of the company in that moment when people are considering - they’re visiting the site and figuring out if they want to be a customer - and we can’t lack empathy at that moment, or we’ll miss a lot of opportunities.
John Fleming 30:07
It’s a great thought process. My late wife was a psychologist, and she used a phrase reasonably often: “This is what you’ve said; this is what I’ve heard.” To me, that sums up quite a lot of the mistakes websites are making - they’re saying things, but people aren’t hearing what they’re saying. They’re hearing something different from what’s being said, because it may be the tone of voice, it may be a lack of empathy, it may be that they’re in the wrong part of the website, or whatever.
I think your mix of empathy and data is absolutely spot-on as a way of moving forward with CRO. The ROI is very deliverable, very hard-focused, and very much what the board looks at if you’d like to justify CRO, but I think there is a lot more that empathy can do to help companies understand what they’re trying to achieve - showing them, rather than telling them. “The story tells them more”, as a phrase went a couple of years ago: using stories to get people to understand what the company believes in, rather than just the product.
It’s an interesting topic, and I’d love to pick it up at some later point, because I think it’s a subject in its own right. I’d love to talk with you off the air about this as well, because I think it’s very interesting. But AJ, we’re running close to our 30-minute time slot on this. It’s been a really fascinating conversation thus far. As I said, I’d like to pick up on this again at some later stage, whether that’s on or offline.
CRO protagonists still have a lot to learn, and I think your methodology is one of the better methodologies I’ve heard for many months. I think it’s very interesting that people are starting to dive a little bit deeper than just return on investment. Thank you very much for your time. As I said, it’s been a great conversation.
AJ Davis 32:09
Thank you so much for having me on. I really enjoyed this as well.
Our team of CRO experts will help you understand your customers and how they are interacting with your site, and we will assess your current experience and recommend the highest impact tests. Get a personalized quote for your website now!