Learn more about your customers by optimizing your website.
*This article makes mention of Google Optimize. Google has officially announced the sunset of Google Optimize. Both Google Optimize and Optimize 360 will sunset on September 30, 2023.
In this podcast, we discuss the importance of optimizing website conversions, how little changes can make a big difference, the importance of testing changes, and the role metrics play in building meaningful analytics.
Transcript
Eric Dickmann: 0:09
Welcome to season two of The Virtual CMO podcast. I’m your host, Eric Dickmann, founder of The Five Echelon Group. Our goal is to share strategies, tools, and tactics with fellow marketing professionals that you can use to impact the trajectory of your company’s marketing programs. We have candid conversations about what works, and what doesn’t, with marketing tactics, customer experience, design, and automation tools. Our goal is to provide value each week with a roster of thoughtful and informative guests engaged in a lively conversation. So with that, let’s introduce this week’s guest and dive into another conversation with The Virtual CMO.
Today, I’m delighted to welcome AJ Davis to the show. AJ is an industry expert in user experience strategy with a proven track record of delivering measurable value to clients. She led optimization strategy for Fortune 500 companies during her tenure at Clearhead, including CVS, Steve Madden, and Lululemon. She’s the founder of Experiment Zone, which provides conversion strategy and testing for online businesses. She was also the lead UX researcher on the Google Optimize* product. I’m excited to welcome AJ Davis to the program. AJ Davis, welcome to The Virtual CMO podcast. I’m glad you could join us here today.
AJ Davis: 1:34
Eric, thanks so much for having me. Glad to be here.
Eric Dickmann: 1:37
AJ, I was wondering if you could give the audience a little bit of information on your background. I know you spent some time at Google working on their Optimize product. Tell me a little bit about that experience, what it was, and how it led you to found Experiment Zone.
AJ Davis: 1:52
I had the pleasure of being part of the team from day one on Google Optimize*. Google pulls people together to do these innovation projects, and we had a chance to spend one week doing a sprint, working in a hotel with people we don’t normally work with. From day one there was a team really thinking about what this product could look like. It wasn’t named yet; it wasn’t Google Optimize* yet. But the problem we wanted to solve was there: to enable marketers at all levels to do this information gathering so they could split test and really understand which experience is better for them. I was a user researcher at Google, and I had the opportunity to stay on that team and do user research from that first day all the way to product launch. I got to talk to thousands of people who do conversion rate optimization. And I was so leaned in. As a researcher you’re supposed to be neutral, not leaned in, but I was on the edge of my seat, wanting to eat up everything they were saying and learn what they were doing. I had an aha moment where I realized I was a little jealous of these people I was talking to. They got to improve experiences, which was where I was coming from: focus on the user and understand what they need. And then they got to use data and validate in the real world that it works. So I pulled off the Band-Aid of Google, which was hard, and I got a lot of funny looks as I walked out that door. But I was really glad I made that move and came out to build a business that keeps the user central to what we do, but can also validate and confirm that changes are truly impactful.
Eric Dickmann: 3:31
So when a client seeks out your company’s services, what are typically some of the problems they’re experiencing? What leads them to find you?
AJ Davis: 3:42
Great question. I think there are two different types of reasons. There’s often the data-driven one: I have these goals I need to meet, I have a very fixed budget, I need to spend this much on ads and get this much in revenue, so how do I close that gap? Well, you can hold traffic equal and improve your conversion rate. So there’s the data-minded objective, and conversion rate optimization is a perfect solution for that. And then you have other businesses saying, we’re getting feedback from our customers that our website’s hard to use, or people seem to be getting stuck and lost, and we’re not really sure why. So we also have people coming from the desire to improve experiences, so that their customers are not just having good experiences but might even be delighted by the experience they’re having with that business.
Eric Dickmann: 4:28
Do you think businesses spend too much time focusing on the wrong things when it comes to their website design? What do you see marketing teams doing wrong there?
AJ Davis: 4:39
Yeah. You know, we often come in because a company has redesigned its website. They say, oh, our website’s old, whether it’s four years old or ten years old; people just don’t like old. And what actually happens is you do this big effort, you launch this brand-new thing, and you’re starting almost from zero. You’re taking a really big risk, and you don’t have a plan for how to mitigate that risk and reduce it over time. So what I see is that companies are focused on how the website looks, and not necessarily on what it means for the user. Oftentimes it’s the very small things that add up to whether or not a user sticks around and ultimately becomes a customer, as opposed to whether we have the most beautiful graphics or the most modern website.
Eric Dickmann: 5:30
So trends come and go, and there are fads in design, whether it’s colors, fonts, graphics, styles, whatnot. Do you see too much sameness in website design? Are too many people trying to use a cookie-cutter approach to their web presence?
AJ Davis: 5:49
I think on the design side that may be the case, but from a conversion perspective, the mistake I often see is that people look to their competitors as an example and assume it will work for their own customer set. The real opportunity is to ask: what’s unique about my customers? What do they really need? We have the privilege of sitting in a position where we get to test things on many, many different clients’ sites, and we build up an understanding of what generally works. But we will test the same thing on multiple sites and see different results, because customers are different. So the biggest mistake we see is sameness due to thinking, hey, Amazon tested it, or hey, my competitor tested it, so it must be good enough and we should just copy it.
Eric Dickmann: 6:35
That’s actually a great segue to drill down into this a little bit more. So what exactly is conversion rate optimization, or experience optimization, if you had to define it? And what is the process you use with customers to help them convert more leads?
AJ Davis: 6:52
Great question. Conversion rate optimization, at the most basic level, is the idea of taking the number of visitors you have and getting more of them to do the thing you want them to do, whether that’s to order something, sign up for your email list, or maybe submit a form. The process of conversion rate optimization is taking in various inputs. You’re asking, hey, where are people actually going on our website? What are some of the pain points they have when they’re moving across the site? Then you build the complete picture of what’s happening today and work toward a better tomorrow through A/B testing. For sites with enough traffic, we’ll split the experience. We’ll say we should make this change on the site; half of the visitors will see one experience, and half will see the new, proposed better experience. Then we’ll truly validate that one is better than the other. So the overall process starts with that discovery piece, understanding the context, building out a set of test ideas, evaluating them, and then ultimately measuring them.
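For a sense of what that final validation step can look like, here is a minimal sketch in Python. The 50/50 split and the conversion counts are hypothetical, and the two-proportion z-test shown is one standard way to check a winner at the traditional 95% confidence level, not necessarily the exact method AJ’s team uses.

```python
import math
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion counts between variant A (control) and B.

    Returns the z statistic and the two-sided p-value. A p-value
    below 0.05 corresponds to the traditional 95% confidence level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical 50/50 split of 40,000 visitors.
z, p = two_proportion_z_test(conv_a=600, n_a=20_000,    # control: 3.0%
                             conv_b=680, n_b=20_000)    # variant: 3.4%
print(f"z = {z:.2f}, p = {p:.4f}")   # p < 0.05 suggests a real difference
```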
Eric Dickmann: 7:54
So with what your firm focuses on, do you focus just on A/B testing of website pages? Or is it websites, landing pages, emails, social media posts? How far does that A/B testing go?
AJ Davis: 8:09
We do post-click testing. We’re really focused on understanding, once a visitor has clicked through the email or clicked through the ad and is now on your website, how can we optimize the experience top to bottom, whether it’s a landing page, your home page, or the product pages, all the way through the conversion funnel? So all those moments on the website are truly refined and better meet your customers with what they need.
Eric Dickmann: 8:33
As a researcher, are you surprised sometimes by the changes that really move the needle? Is it as simple as one headline versus another completely changing the result? Or do you find that it’s usually a more drastic change that impacts the results?
AJ Davis: 8:55
I’ll tell you a story about this one. The answer is both. Sometimes things will surprise you: little things will have a bigger impact, or bigger things will have a smaller impact than you might expect. But sometimes a different variable changes altogether. We ran a test for a client where we were trying to drive more email signups, and what ended up happening was we drove more orders on their site. The way it happened was that the placement at the top of the page would flip between the email signup form and a free shipping message. As visitors moved across the site, the content moved and changed in that part of the page. Our theory of what happened was that because there was this exposed field, people were looking up there, and when it was missing they looked up there again, saw the free shipping message instead, and felt very motivated by it. So we drove email signups, but we also drove more orders. As a researcher, I would never have anticipated that, but once we understood the data, we had this aha moment of, wow, how can we leverage that in other ways? That’s really the power of testing: things can move and change that you weren’t expecting, and by measuring it and really understanding the whole ecosystem’s impact, you ultimately build a better understanding of how people work.
Eric Dickmann: 10:13
A lot of the audience for this podcast are small to midsize businesses. When you’re doing A/B testing, what do you feel is the minimum threshold of visitors you have to have in order to really get a meaningful result?
AJ Davis: 10:32
That’s a good question, and this is something that’s highly discussed in our field. The rule of thumb is generally 100,000 visitors a month. That’s the amount of traffic that gives you the ability to change very small things and understand the impact on orders. But there’s a range below that, say 30,000 to 100,000 visitors a month, where you can still be testing; you just have to adjust a couple of things. If you make bigger changes, you’ll be able to detect them with less traffic. And if you test in places where people are more likely to see the change, the top of the page, the home page, the product page, bigger locations with more traffic, that can also be more impactful. But you have to really think about those variables that affect your ability to detect conversion rate changes. For companies below the 30,000 range, we can use qualitative methods, triangulated with analytics, to really understand the impact of some of these changes. And sometimes you still want to A/B test in that range, even if you’re not reaching that 95% confidence. So it’s really a range of approaches as you move across that monthly traffic threshold.
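To make those traffic thresholds concrete, here is a hedged sketch of the standard sample-size calculation for a two-variant test. The 2% baseline rate, the lifts, and the 80% power setting are illustrative assumptions rather than figures from the episode; the formula is the usual normal approximation for comparing two proportions.

```python
import math
from statistics import NormalDist

def visitors_needed(baseline_rate, relative_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed *per variant* to detect a relative
    conversion-rate lift, using the standard normal-approximation
    formula for comparing two proportions (two-sided test).
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # 1.96 at 95% confidence
    z_beta = NormalDist().inv_cdf(power)            # 0.84 at 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# Illustrative: 2% baseline conversion, trying to detect a 10% relative lift.
print(visitors_needed(0.02, 0.10))  # ~80,700 per variant
# A bigger swing is far cheaper to detect: a 30% lift needs ~9,800 per variant.
print(visitors_needed(0.02, 0.30))
```

The second call shows why bigger changes help at lower traffic: the required sample size shrinks roughly with the square of the effect size.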
Eric Dickmann: 11:47
I guess a follow-up question to that: I think of my iPhone. Every couple of years they do a significant update that completely changes the user interface, and for a lot of people, when they first get that update, it’s like, oh, I don’t know if I like this; I liked the old way of doing things. And I think about companies that redesign their website, especially if they have a strong existing user base. Customers come to the website and things are different, not what they’re used to. But then I think of my own experience with the iPhone: when I go back and look at the previous version, I don’t know why I liked it. It seems so old, so out of style. There’s a learning curve and a time of adjustment. So when you’re making these big changes, how do you know when you’ve made a change that’s fatal, that’s really going to hurt your business, versus something that’s just an adjustment your customers and prospects need to get used to before they embrace it? Does that make sense?
AJ Davis: 12:51
Totally makes sense. You know, when we talk about changes on websites, even a big change, we might just mean that a button is now in a more prominent position, or the colors have changed, or the way we talk about something has changed. And generally, visitors won’t notice the changes we’re testing. If they came yesterday and come back today to this new experience, they may not realize anything has changed. Then there are much bigger changes, like site redesigns, where you do need to consider the longer-term impact. You need to anticipate the resistance that might happen at the beginning and understand how to transition people to it. But I think a lot of businesses are very optimistic that they’ll have that Apple experience: they think they’ve made the right bets, that customers will get used to it and end up happy and satisfied. And if you don’t talk to your customers, and if you don’t measure it, you’re not going to know whether you’ve pulled it off. You may end up tanking everything just because you think you’re the smartest person in the room and don’t need to measure anything.
Eric Dickmann: 13:58
That actually brings up the topic of surveys. One of the things I personally find is that surveys seem almost designed by many businesses to satisfy their own metrics. They want you to rate the call center agent a five, because if the agent is rated a five, they get a bonus. Or the car dealer wants you to say they’re great, because then they get something from the manufacturer. But it’s not really designed to elicit feedback that can ultimately help the business; it’s more of a pat-on-the-back kind of thing. Do you find that businesses survey the wrong way? That they’re not really getting meaningful results from the surveys they use?
AJ Davis: 14:39
I love your description of this, because I feel the same way when I get those surveys: how am I helping you? What am I accomplishing by providing this information? In anything we do, we should put ourselves in the user’s chair and understand what they might be thinking or how they might perceive the questions. If I’m asking someone for feedback, I don’t need all their demographic information when I’m just trying to understand why they’re leaving the website. So there are two types of surveys, two types of qualitative approaches, that we like to think about. One is something you measure over time for changes; sentiment scores are often used for that, questions like, how likely are you to recommend this business? That can be useful and actionable if you see a major change in it. The other, and I think this is the real opportunity for most businesses, is to know what you’re going to do with the data and to target your research with that in mind. First you say, hey, I think we’re seeing a major drop-off between cart and checkout; can we figure out what’s going on here? Then you run a targeted survey to people on those pages to answer that question, and you structure the questions so you get more open-ended feedback and really uncover the problem, as opposed to patting yourself on the back and saying, nope, no problem here, good enough, we’re happy with our conversion.
Eric Dickmann: 16:01
It’s terrible. There’s so much patting on the back going on. Even these phone calls now, where they ask you ahead of the service interaction whether you want to take the survey. It’s just too much. I’ve got a question about some tools that savvy marketers may or may not use on their website. There are tools like Hotjar and Lucky Orange where you can generate heat maps of your website to see the interaction. Now, my understanding of what you’re doing is that it’s much more specific than just seeing what people are clicking on; in reality, this is a behavioral study. Talk to me a little bit about the difference between what a tool like that can provide a business and what your firm does that expands upon it.
AJ Davis: 16:48
Yeah. We talk to a lot of people who have these tools on their website; they get them up and running and then just leave them running in the background, or they collect some heat maps and don’t really know what to do with them. So either there’s an overload of data, a tidal wave of customer data they’re not sure what to do with, or they dabble in it and never see the value. What we do is targeted collection of the data. We establish a baseline of what people are doing so we can spot potential pitfalls or moments where we need to dive in and explore. And then we’ll go deep, using tools like Hotjar and Crazy Egg, to really see what’s happening. For example, we had a customer that was having some issues with their checkout. We looked at the analytics, and there wasn’t a clear moment where people were dropping out, and we couldn’t see anything clearly in a visual UX audit. So we went through Hotjar session recordings of just the checkout and discovered that for certain device types, the screen broke. It didn’t break in the sense of firing an error that the dev team would find, but it made checkout nearly impossible: people would get stuck, couldn’t zoom in and out, couldn’t type in the fields. It became this really high-friction experience. So when you think about those tools from the perspective of what you’re trying to learn, just like with a survey, then you can really get value from them. But if you’re just checking a box, I have a heat map running, now I know people scroll on my page, well, you would have known that even without the tool. So make sure you know what you’re using the tools for, so you can get that value from them.
Eric Dickmann: 18:31
Marketing is the engine that drives demand, but too often it takes a back seat to other priorities. Awareness fails to materialize, demand drops, and sales falter. Don’t wait until it’s too late to build your brand awareness and demand generation programs. If your company is struggling with its marketing strategy, we want to help. Let’s schedule a call to talk about your unique situation and what options might be available to get your marketing program back on track. To learn more, text CMO to (407) 374-3670. That’s CMO to 4-0-7, 3-7-4, 3-6-7-0, and we’ll reply with further details. We hope to hear from you soon. Today, the web tools that we have are so advanced, and we’ve got such interesting website design capabilities: parallax scrolling, button animations, embedded video, moving backgrounds, all of these kinds of things. Do you see that there’s just too much distraction on many websites, that people are enamored with all these fun little tools and animations but missing the point of really driving focus for their customers?
AJ Davis: 19:49
Yes, 100%. It’s so easy to see what other companies are doing and think it’s important or something we have to be doing. There are all these fads that come through, and we think we need to design the best widgets, we need the most visually stimulating things so people feel like they’re in a dressing room. But if we don’t know who our customers are and how that adds to their experience, it’s going to be either a distraction or a detraction. It’s either going to take people away from actually buying, because it’s a fun tool to play with, or it’s just not going to meet them with what they need, and they’ll find a different brand that’s happy to give that to them. So honestly, one of the starting places with brands is we go through their website and find places to remove things. We start with removal instead of addition. It’s very easy to think, I need to add these tools, I need to add these widgets, let me have some more imagery, and just pile it on, when all your customer really needs is to know what you sell, how to get it, how to add it to cart, how to order it.
Eric Dickmann: 20:53
I think that’s great advice, because all that complexity also slows down the load of these pages. When you add too much, sometimes simple and streamlined is the most efficient, and Google then rewards you in search as well. I’ve got a question about data. Data is so important for analyzing the performance of your marketing efforts. Do you think most businesses have the data they need, or even know what to do with it, to analyze whether their on-site marketing is performing successfully?
AJ Davis: 21:27
I think a lot of small businesses struggle to start with the right data. When we talk with small businesses, sometimes we hear that they’re afraid to look at their analytics. Maybe they don’t even have the tag installed yet, or they’re depending solely on their ecommerce store’s data instead of a fuller analytics platform that can really give them customer behavior insights. Medium and larger businesses, who may have some of that groundwork laid, have the opposite problem: they just have a ton of data. They pull reports, the reports get automatically emailed, and nobody looks at them or takes action on them. So in both cases, you need to understand what your objectives are and what will help you see when something new is happening, so you can find those anomalies and address them. And then you need to be able to do targeted data exploration, so that when problems surface you can dive deep and really understand why they’re happening and what the opportunity is to fix them.
Eric Dickmann: 22:28
Obviously conversion rate is an important metric, but are there a few specific metrics you really like to focus on when you’re working with businesses, to get that underlying data collected so you can measure what’s going on with conversion?
AJ Davis: 22:44
Great question. We actually do this on a test-by-test basis. When we first start, we do a lay of the land: we understand where people are coming in and where they’re dropping off in the funnel. Then, for each specific test, we look at three different types of metrics. First, what is this test targeting? What’s the most immediate change we’re making? For some tests that’s clicks on the add-to-cart button, or visits to a product detail page. It’s not the order; it’s the thing closest to the change we made. Then you have the conversion metrics, everything that happens downstream: add-to-cart clicks, checkout starts, starting to fill out the checkout form. You look at all those moments to see the impact of that initial change. And then you look at the metrics around it, which help illustrate why things happened and what happened because of the change. So if we were to change something on a product listing page, say we wanted to change the filters, we’d want to look at how changing the filters affects where people click on the page. If people have better filters, do they use search less? Are there things around that variable we also want to track, so we can really understand, hey, by driving more attention to this element, did we draw attention away from other things that may actually matter more for conversion? That way we set ourselves up to better understand what’s most important for customers overall.
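As a rough illustration of that three-tier framing, here is a small Python sketch of how a metric plan for one test might be written down before launch. The structure and the metric names are invented for illustration; AJ doesn’t prescribe a specific format.

```python
from dataclasses import dataclass, field

@dataclass
class MetricPlan:
    """Three tiers of metrics for a single A/B test, set before launch."""
    target: str                  # metric closest to the change itself
    conversion: list[str]        # downstream funnel metrics
    surrounding: list[str] = field(default_factory=list)  # context metrics

# Hypothetical plan for the filter-redesign example above.
filter_test = MetricPlan(
    target="filter interactions on the product listing page",
    conversion=[
        "product detail page views",
        "add-to-cart clicks",
        "checkout starts",
        "orders",
    ],
    surrounding=[
        "site search usage",              # do better filters reduce search?
        "clicks on other page elements",  # did we pull attention away?
    ],
)
print(filter_test.target)
```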
Eric Dickmann: 24:07
When you present this data back, along with your recommendations for changes, do you sometimes find yourself fighting pride-of-authorship bias, where people push back on the recommendations because they say you don’t understand their customers, or you don’t understand what they were trying to do with a specific design? Or are people pretty open and receptive to it?
AJ Davis: 24:28
I think with the data, they’re open and receptive. Where the conversations sometimes get challenging is around what the solution we test should be. And I think that’s actually a healthy conversation to have, because you want to identify why you’re testing something and what your objective is before you get to the solution. Oftentimes you’ll hear, I hate the filters, we need to delete them. But we need to know why you want to do that. What’s the potential behavioral change, and what are some other solutions that could elicit that change? Maybe we could move the placement, redesign them, make them sticky. There’s a handful of solutions from a design perspective that would potentially achieve the same goal. So what I like to do is spend the conversation agreeing and aligning on what our objective is, and then we can explore different tactics to evaluate. And the best thing about testing is that you can test one of those things, then the next, and keep working through that list until you find solutions that are really going to move the needle for your customers.
Eric Dickmann: 25:29
What I like about what you’re describing is that it sounds like a process, a collaboration. It’s not that you just feed something into an algorithm and out comes a recommendation that you deliver in a PowerPoint presentation to management. It sounds like a process that takes place over a series of months: you collaborate, you get some results, you make some recommendations, and then you analyze whether those recommendations were effective. Is that a good summary of how it works?
AJ Davis: 25:55
That’s a perfect summary. We love the collaboration aspect of the work we do, because we believe those businesses know so much more than we do about their specific industry and their specific audience, while we know a lot about testing and how to achieve the business goals. By talking regularly and collaborating throughout the process, the whole thing gets raised up, rather than us being a black box that says, hey, we did it, here are some outcomes. So we do weekly or biweekly meetings with our customers just to keep that conversation going and have full transparency into what we’re working on and what’s next in our test library.
Eric Dickmann: 26:35
I think the reason I enjoy this conversation so much is because marketing is expensive. It costs a lot of money and takes a lot of effort to get people to come to your website, so spending a little extra time and money to convert a higher number of those visitors is a really good investment, because it costs so much more to get them to the website in the first place. I think what you’re doing here is great, and so many businesses could really take advantage of conversion rate optimization because there’s just so much value in it. And I know you’ve put together some things. Don’t you have an upcoming webinar series that our listeners could take advantage of?
AJ Davis: 27:16
Yeah, we actually just launched a new webinar series called Growth Through Empathy. It’s a marketing webinar series where we look at specific brand websites and get really specific with the brand on the call about tactics they can take from a marketing and conversion perspective. That happens each month, and visitors can go to growththroughempathy.com if they want more information. We’re also offering a free conversion teardown. So if your listeners are interested in getting our feedback on a page of their website, they can go to experimentzone.com and sign up for that. There’s no time required on their part; they just fill out the form, and we’ll send them an email with a screenshot and recommendations.
Eric Dickmann: 27:57
Sounds really interesting. I know I personally want to go check out that Growth Through Empathy site and see what you’re doing there, because this has been a fascinating conversation. AJ, I really appreciate you being here today and walking through all of this; I think there’s some great value in it. I’ll have all of this in the show notes, along with ways people can get ahold of you. This has been a great conversation. Thank you so much for being on the show today.
AJ Davis: 28:18
Thanks so much for having me. I really enjoyed it.
Eric Dickmann: 28:21
That wraps up another episode of The Virtual CMO podcast. As a reminder, if you’d like to learn more about Virtual CMO strategic marketing consulting services, or anything else discussed here today, please visit us at fiveechelon.com; there’s a link in the show notes. We welcome your comments, feedback, and guest inquiries, and your five-star reviews on Apple Podcasts are always appreciated. If you’d like to reach me, I’m EDickmann, that’s E-D-I-C-K-M-A-N-N, on Twitter. And if you’d like to connect on LinkedIn, please let me know you heard about me through The Virtual CMO podcast. I look forward to talking with you again next week and sharing some new marketing insights on The Virtual CMO.