Web UX Design for High Converting Websites: Maximizing Conversion & A/B Testing
Transcript from the "Maximizing Conversion & AB Testing" Lesson
>> The final place where you can do testing and optimization and by far the most important is actually once you go live, right? Once your website is live, that is the moment where you finally see real users interacting with the real website in real time in a completely natural environment, right?
[00:00:23] In other words, they're not being watched, they haven't got someone over their shoulder watching everything they do. They're just using the website. And that's where the most valuable insights appear. And seriously, you could double or even triple your conversion rate by doing some post launch optimization.
[00:00:44] And it's a very simple step by step process, right? I've written in my notes, that's sitting on the screen, hammer home this point, right? Of everything I've said today, this little loop that I'm gonna show you now is the most valuable thing that you can do to improve the conversion rate on your existing site, right?
[00:01:06] Don't do a big redesign cuz it'll take forever and it will be really difficult. Instead go through this loop again and again, and you will systematically improve your website. So this loop consists of three steps. Step one, find a problem area to address. So we're gonna find an issue with the website, a page that is underperforming, okay?
[00:01:30] I'm gonna go into these in more detail in a minute. Step number two is then to diagnose exactly what the problem is on that page, okay? And then step number three is to test a number of potential solutions to that exact problem that we've identified, right? So let's break that down in a little bit more detail.
[00:01:53] Number one, identify problem areas with the page. And how are we gonna do that? Well, we're gonna do it with our analytics. Now, a lot of you guys watching this are developers and probably a lot more analytical in your thinking than I am. To be quite honest, I suck at analytics, right?
[00:02:14] I open up something like Google Analytics and my eyes glaze over and I start dribbling at the corner of my mouth. It's not my natural environment. But even I am able to do the basic stuff that is required to identify problem pages on the website. Because effectively there are just two things you're looking for.
[00:02:33] So you can do this immediately after this session, right? Log into Google Analytics and look for two things. Thing number one is exit pages. Where are people leaving your website, okay? Where are they most abandoning it? So that's the first thing you're looking for. Just make a list of the top five or so pages that people are exiting the site from.
[00:02:58] And then secondly, look at your bounce pages. So these are pages where someone is going to the page and then, within the first few seconds, the first 20 seconds, leaving the page. And actually, you wanna filter out people that are spending less than about five seconds on the page.
[00:03:19] Because those people aren't even really looking at the page properly. So in those situations, it's not necessarily that the page is wrong, they just didn't mean to click on the link or whatever. But people that are staying between say 5 and 20 seconds and then abandoning, that means there's probably a problem on the page.
[00:03:40] So it's pages where people are completely leaving the site and pages where they're not spending very long on a page. So those are the two things you wanna look at in your analytics. And that will give you a list of pages that are potential problem pages. And then you just pick the worst, right?
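As a sketch, those two analytics checks could look something like this in Python, assuming you've exported per-page stats from your analytics tool. The record fields here (`path`, `pageviews`, `exits`, `quick_bounces`) are invented for illustration; adapt them to whatever your export actually contains:

```python
# Sketch: rank potential problem pages from a hypothetical analytics export.
# "quick_bounces" counts visits lasting roughly 5-20 seconds, per the advice
# to ignore sub-5-second visits entirely. All numbers are made up.

pages = [
    {"path": "/pricing",  "pageviews": 800,  "exits": 420, "quick_bounces": 180},
    {"path": "/features", "pageviews": 1200, "exits": 300, "quick_bounces": 90},
    {"path": "/signup",   "pageviews": 500,  "exits": 350, "quick_bounces": 60},
]

def problem_score(page):
    # Combine exit rate and the 5-20 second bounce rate; highest score first.
    exit_rate = page["exits"] / page["pageviews"]
    bounce_rate = page["quick_bounces"] / page["pageviews"]
    return exit_rate + bounce_rate

worst_first = sorted(pages, key=problem_score, reverse=True)
for page in worst_first[:5]:
    print(page["path"], round(problem_score(page), 2))
```

However you weight the two rates, the output is the same thing the speaker describes: a shortlist of candidate problem pages, worst first.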
[00:03:58] You look at the one with the highest bounce rate or the highest exits and you start with that. That's gonna be the page we're gonna first of all improve and iterate upon. So once we've done that, we can then look specifically at that page. Now, earlier I talked about the Microsoft Clarity as a tool and we're gonna use that to narrow down the problem, right?
[00:04:21] So what we do is we run Microsoft Clarity, or if you've already got Hotjar, that will do the same, on that particular page. You don't tend to have it running across the whole site, because these tools begin to interfere with performance if you leave them running on everything.
[00:04:38] So we're just gonna use it for as long as we need it on that particular page. And that's gonna produce some session videos, right, of people interacting with that page. We're gonna see people scrolling up and down the page. We're gonna see people clicking on things, all that kind of stuff.
[00:04:57] Now, you can, if you want, run it across the site more broadly. And then Microsoft Clarity will provide you with some analytics on things like the number of people rage clicking or quickly going back from a page, bouncing from a page, and things like that. So you can always use it as a replacement for Google Analytics, but personally, I prefer Google Analytics.
[00:05:18] But what you're really looking for from Microsoft Clarity is some sessions that you can play back and watch people moving around the page. And it'll also give you some heat maps. So it'll show you what people are scrolling to, what they're seeing on the page, what they're not.
[00:05:35] And so you can use this to kind of work out what may be going wrong with the page. For example, are people scrolling past some critical piece of content without reading it? Are they trying to click on something that's not clickable? Are they missing a call to action entirely?
[00:05:50] That kind of stuff. So after watching a few of those videos and looking at the heat maps, you'll begin to have an inkling of what might be going wrong. And that will hopefully inspire some potential solutions, right, ideas for how you might fix it. If you've watched those videos and you've still got no clue what's going wrong with the page,
[00:06:13] get people to do some usability testing, where you ask them to complete a task that involves going through that page, right, in order to see where things are going wrong. And ask them once they've been through the page, did you spot this piece of content? Or get them to speak out loud any problems they have, and you'll be able to narrow down the problem.
[00:06:38] Once you've narrowed down the problem, the next stage is to come up with some solutions that hopefully will fix the problem and test them. Now, how you test your solutions is somewhat dependent on what the solution involves. So there's basically two types of ideas you'll come up with.
[00:06:59] There'll be really quick and simple things, like, well, if we change the heading, or if we rename the button, or if we change the color, or maybe if we swap out that image, basic stuff like that, right? And then there'll be more complex stuff which might involve some functional changes, like we're gonna remove the CAPTCHA from the form, that kind of thing.
[00:07:25] For the really simple stuff, the best solution is to run A/B testing. Now, if you've never run A/B testing before, it really, really is incredibly straightforward. Chances are you've already got it set up on your website and just don't know. Because if you've got Google Analytics, you're already set up with what they used to call Google Optimize.
[00:07:48] But I think it's now got a new fancy name. But if you search Google Optimize, you'll come up with it. And basically, it's really, really simple if you've never done A/B testing before: the majority of your traffic will still go to your original version, say the page with your original headline or whatever.
[00:08:09] And then you'd have various variations where you're changing your page in slightly different ways. You might change the heading, you might have different headings that you test, that kind of stuff. I'm sure you're familiar with A/B testing. One area that does cause confusion is the difference between A/B testing and multivariate testing.
[00:08:27] A lot of people get that really confused, because everybody thinks that multivariate testing is when you test multiple versions. So on this screen here we've got three variations, and people think that's multivariate, but it's actually not. Multivariate testing is where you're testing multiple variables. So you're changing multiple things across the page.
[00:08:52] So in A/B testing, you're making a single change. You might be changing the headline in five different ways, trying five different headlines. Multivariate testing is where you're changing the headline and the button, changing more than one thing on a page at a time. And you can do all of this really simply just through a WYSIWYG editor; it couldn't be much easier to do.
[00:09:16] You just go in, you edit whatever it is you wanna edit with a different version, maybe a different headline or whatever. Save it, and that becomes a variation, job done. If you've got a big website with a lot of traffic going to it, A/B testing is an absolute must.
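To make the A/B versus multivariate distinction concrete, here's a minimal sketch (the variant copy is invented): an A/B/n test varies one element with several alternatives, while a multivariate test crosses multiple elements, so every combination becomes its own variation and the cell count multiplies.

```python
import itertools

# A/B (strictly A/B/n) test: ONE element changes, several alternatives.
headline_variants = ["Get started free", "Try it today", "Start your trial"]

# Multivariate test: MULTIPLE elements change at once, and every combination
# becomes a variation -- 3 headlines x 2 buttons = 6 cells to fill with traffic.
button_variants = ["Sign up", "Create account"]
multivariate_cells = list(itertools.product(headline_variants, button_variants))

print(len(headline_variants))   # 3 variations in the A/B/n test
print(len(multivariate_cells))  # 6 cells in the multivariate test
```

That multiplication is also why multivariate testing is hungrier for traffic: each of those six cells needs enough visitors on its own.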
[00:09:33] And you wanna explore as many different versions as possible. So if you wanna change your headline, don't just do one alternative version of the headline, do 5 or 6. The reason being, the more versions you do, the more chance you're gonna hit on one that creates a really big jump in conversion, right?
[00:09:57] Which is obviously what we want. The problem is that a lot of people don't have high traffic websites, they have quite low traffic websites. And so they go, well, I can't do A/B testing then, right? Cuz you have to be Amazon or booking.com or someone like that to do that kind of testing.
[00:10:18] And that isn't strictly true, you just need a slightly different approach to it. So one option for a start is you don't do lots of variations, because every variation needs people driven to it, and that means that obviously fewer people are going to any particular version.
[00:10:39] And that means it takes longer to declare a winner. So you wanna test maybe just one alternative if you've got a low traffic website. Also, the other problem you can have is that often there's a big gap between, this is really hard to explain, one day I'll work out how to explain it well.
[00:10:59] You'll have a big gap between the point you're testing and the point of conversion. So let's say for example you wanted to test variations of a product name, and whether changing the product name is gonna lead to more sales, right? So they're on the product page and they see an alternate version of the product name.
[00:11:34] Now, there's quite a long way to go, lots of pages where they could potentially drop out. So they might add it to their basket but then drop out of the basket. They might go from the basket to the checkout but not enter their address information. They might drop out at the credit card step or any number of other points.
[00:11:52] And so as a result, you need a lot of traffic for enough people to reach that final step to know whether the new product name is working or not. So the way that you can get around this problem is by shortening the distance between the point of conversion and the page being tested.
[00:12:15] So instead of testing whether, if you change the product name, people get all the way to checkout where they've actually bought it, you can test whether they added it to their basket. And that can be your conversion point instead, so it's much closer between the two.
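A rough worked example of why this helps, with invented rates: if 20% of product-page visitors add to basket but only 2% complete checkout, measuring add-to-basket gives you ten times as many conversion events from the same traffic, so the test resolves far sooner.

```python
# Sketch: how moving the conversion point closer to the tested page
# multiplies the conversion events you collect from the same traffic.
# All rates here are invented for illustration.

visitors = 1000
add_to_basket_rate = 0.20   # 20% of product-page visitors add to basket
purchase_rate = 0.02        # only 2% make it all the way through checkout

basket_events = visitors * add_to_basket_rate   # events if basket is the goal
purchase_events = visitors * purchase_rate      # events if purchase is the goal

print(int(basket_events), int(purchase_events))
# Ten times the events per visitor means the test reaches a readable
# result with roughly a tenth of the traffic.
```

The trade-off, of course, is that add-to-basket is a proxy: you're assuming a name that gets more baskets also gets more purchases.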
[00:12:33] So how do you go about dealing with that? Well, that's one option: close the gap. The other option is to just wait, right? So you wait till you've had enough traffic go to the site and eventually, Google will declare a winner. You just have to wait longer, but who's got time for that?
[00:12:49] The second is you reduce the threshold. So, in other words, you go, well, okay, Google is saying it can't declare a statistically significant winner yet out of the different options. But I can see from looking at it that that one's doing better. And so I'm gonna declare a winner myself rather than waiting for the stats.
[00:13:09] And then the third one is this idea of closing the gap to make the testing more reliable. Most of the time, to be honest, on low traffic websites, I'll just go in and go, well, look, that one's doing pretty well. I'm pretty confident that that's the winner, even if Google doesn't declare it the winner.
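If you'd rather lower the threshold with at least some statistics behind the call, one common approach (not something the speaker names, and the counts below are invented) is a two-proportion z-test, which puts a number on how confident you can be that the variation really is ahead:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # combined rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented low-traffic numbers: 400 visitors per arm.
# Original: 20/400 convert (5%). Variation: 32/400 convert (8%).
z = two_proportion_z(conv_a=20, n_a=400, conv_b=32, n_b=400)
print(round(z, 2))  # 1.72
# 1.72 clears the ~1.64 cutoff for 95% one-sided confidence, though not the
# 1.96 of the usual two-sided 95% bar that testing tools tend to wait for --
# exactly the kind of "good enough for me" call described above.
```

In other words, "declaring it yourself" is really just accepting a lower confidence level than the tool's default.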
[00:13:28] That's normally good enough for me. So, sorry, I shouldn't move on quite yet. That's great for testing small little stuff like that. If you're trying to test bigger stuff, so maybe you're making changes to the checkout process as a whole or adding new functionality or something like that, then normally that's where you wanna do a prototype.
[00:13:52] You wanna mock it up as a clickable prototype in something like Figma. It's relatively quick and easy compared to coding it. And then you just wanna run some basic usability testing on that and see how people respond. Because that's something you're not gonna be able to do with A/B testing.
[00:14:11] And so between those two types of testing, you can really begin to solve a problem. And then when you solve one problem, you found a solution that gets a higher conversion rate. You tick it off, you go back to your list of poor performing pages and you pick the next one.
[00:14:25] And then you rinse and repeat the process again and again and again. And that's how you improve conversion on your website. And you'll see a bigger improvement in conversion rate by doing that than by anything else you could do.