Mastering the Design Process

The "Post Launch Optimization Process" Lesson is part of the full, Mastering the Design Process course featured in this preview video. Here's what you'd learn in this lesson:

Paul walks through post-launch optimization, including identifying drop-out points, identifying the problems, testing solutions, and publishing those solutions. How a solution is tested depends on the size of the change: small changes can use an A/B testing tool, while significant changes may require a prototype of the new approach.


Transcript from the "Post Launch Optimization Process" Lesson

[00:00:00]
>> So let's look at the process that I use, cuz a big chunk of the work that I do is post-launch optimization work. So I'll just very briefly talk you through the process that I sell to my clients and tend to do with them.

[00:00:16] It's very high tech and you may well have trouble following it. No, it's not, it's very straightforward. It's embarrassingly straightforward, actually, considering the amount that I get paid to do it. So, first of all, identify drop-out pages, right? So where in your app, where in your website? Bear in mind, I tend to do conversion rate optimization on marketing websites.

[00:00:39] So apologies if I talk in those terms; the same principle applies. I find where things are going wrong, and that's step one. Then step two is to identify the problem with that underperforming page. There's a particular place in the app or the website or whatever where things are going wrong, but okay, what's the specific problem at that point?

[00:01:05] Then that hopefully triggers a load of potential solutions. I then test the solution, right? Mind blown at this amazing technique I have. Then I roll out that solution, right? And I go back to the beginning and start all over again, identify the next drop-out point, and rinse and repeat endlessly forever, right?
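To make the shape of that loop concrete before the detailed walkthrough, here's a purely illustrative sketch in Python. Every function and value in it is a hypothetical stand-in, not any real tool's API:

```python
# A purely illustrative sketch of the optimization loop -- every function
# and value here is a hypothetical stand-in, not a real tool's API.

pages = [
    {"path": "/checkout", "bounce_rate": 0.62},
    {"path": "/pricing", "bounce_rate": 0.41},
]

def diagnose(page):
    # Step 2 in practice: heat maps, session recordings, exit-intent surveys.
    return f"hypothesized problem on {page['path']}"

def best_variant(problem, variants):
    # Step 3 in practice: an A/B test for small changes, a prototype for big ones.
    return variants[0]

# Step 1: work from the worst drop-out point downwards, then rinse and repeat.
for page in sorted(pages, key=lambda p: p["bounce_rate"], reverse=True):
    problem = diagnose(page)
    winner = best_variant(problem, ["variant A", "variant B", "variant C"])
    print(f"{page['path']}: roll out {winner!r}")  # step 4: publish, move on
```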

[00:01:29] So let's just run through that in a bit more detail so it makes sense. So I start almost always in Google Analytics, really. I'm basically just looking for pages that have got a high bounce rate or a high exit rate, right? So where people are going, screw this, I'm off, right?

[00:01:48] Those are the points you've got to look at. And, of course, you're gonna find lots of those. So that's your list that you're working through over time: you start with the ones with the highest bounce rate and the highest drop-out rate and then just work your way down. So once I've found a page from that list, I then typically shift across into Microsoft Clarity, which I talked about earlier.
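As a minimal sketch of that triage pass, assuming you've exported a page report to CSV (the file name and column names here are hypothetical):

```python
# A minimal sketch of the Analytics triage step. The CSV export, its file
# name, and its column names are all hypothetical.
import pandas as pd

df = pd.read_csv("ga_pages_export.csv")  # columns: page, sessions, bounce_rate, exit_rate

# Ignore low-traffic pages so a handful of visits can't dominate the ranking.
candidates = df[df["sessions"] >= 200]

# Worst offenders first: highest bounce and exit rates top the working list.
worklist = candidates.sort_values(["bounce_rate", "exit_rate"], ascending=False)
print(worklist[["page", "sessions", "bounce_rate", "exit_rate"]].head(10))
```

That ranked list becomes the backlog you work through over time; the next step is seeing what's actually happening on each page, which is where Clarity comes in.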

[00:02:10] And I'm looking for certain things on the page. First of all, you gotta identify what the problem is. Is it the fact that the user is missing something on the page, right? So what I'll look at is, say, the heat maps of the page to see where people's attention is going, where they're looking.

[00:02:28] I look at the session recordings, and I might even use my eye tracking software to see where people are looking and whether they might have overlooked something. If it's not that, another common problem is, did people not understand something? So they've seen it, but they didn't understand what it is or how it works.

[00:02:49] So in that case I'm looking for things like rage clicks. You know, when someone repeatedly clicks on something that's not clickable because they're annoyed? Or any click on a non-clickable item: they're clicking on an image where it's only the text underneath that's clickable, things like that. Or did they trigger some kind of validation error on a form?

[00:03:10] That's another common one that causes people to abandon, cuz they entered their data in the wrong way or whatever. And if it's not those, then the third area I'm often looking at is, did we not convince them, right? So is it that our messaging is not convincing enough, not compelling enough?

[00:03:31] And so I tend to use an exit-intent survey, asking people why they didn't act, right, for my kind of marketing site. But it's normally one of those three reasons, basically, and you just look at different sources to identify it. And then you go, okay, so people aren't understanding this form field, or they missed this drop-down menu, or whatever it is.
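Of those three problem types, the "did they understand it" signals are the most mechanical to spot. Clarity surfaces rage clicks and dead clicks for you, but the underlying idea is simple enough to sketch; the click-log format and thresholds below are made up for illustration:

```python
# Flag rage clicks (repeated fast clicks on one element) and dead clicks
# (clicks on non-clickable items) in a hypothetical, time-ordered click log.
from collections import defaultdict

# (timestamp in seconds, element selector, is the element clickable?)
clicks = [
    (10.0, "img.product-hero", False),
    (10.4, "img.product-hero", False),
    (10.7, "img.product-hero", False),  # three fast clicks on a non-clickable image
    (25.0, "a.buy-now", True),
]

WINDOW, THRESHOLD = 2.0, 3  # 3+ clicks on one element within 2 seconds = rage

by_element = defaultdict(list)
for ts, element, clickable in clicks:
    by_element[element].append((ts, clickable))

for element, events in by_element.items():
    times = [ts for ts, _ in events]
    rage = any(times[i + THRESHOLD - 1] - times[i] <= WINDOW
               for i in range(len(times) - THRESHOLD + 1))
    dead = any(not clickable for _, clickable in events)
    if rage or dead:
        print(f"{element}: rage clicks={rage}, clicks on non-clickable={dead}")
```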

[00:03:54] Then okay, how do we fix that? I've got some ideas about how to fix that. And I come up with those different ideas, maybe even multiple versions. I tend to favor at least three or four different versions. It depends on how you're gonna test it, but basically, yeah, I get a variety, and then I'll test it, right?

[00:04:17] Now, how I test will depend, basically, on the size of your solution, right? So if it's just a little small solution, if you're making changes to color, text, layout, little things like that, then I tend to favor doing A/B testing, which is what we were talking about earlier.

[00:04:39] So me, I use VWO a lot of the time to do that. You could use Google Optimize, which is free. It's up to you what you use. But basically, I'll run a test where I'll send a small portion of the audience to each one of the different solutions, right, that I've come up with.
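The A/B tool does the statistics for you, but it helps to know roughly what's underneath. A minimal sketch of judging one variant against the control with a two-proportion z-test, using statsmodels; the numbers are invented:

```python
# Did the variant genuinely outperform the control, or is the lift noise?
# Conversion counts and visitor numbers here are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

conversions = [120, 152]  # control, variant
visitors = [4000, 4000]

stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
# A p-value below your chosen threshold (commonly 0.05) suggests the lift
# is unlikely to be chance; otherwise, keep the test running.
```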

[00:04:58] If it's a more substantial change, like I'm changing the whole flow of, say, a checkout system or something like that, which has got multiple changes and they're quite complex, so it's a larger set of changes, that's when normally I end up prototyping a solution. And as we were discussing earlier, I tend to favor facilitated usability testing to actually understand, well, if it did work better, why did it work better?

[00:05:27] Because the why is a really important aspect of it; it allows you to refine the idea even further. So, for example, I remember doing one test where we replaced a VeriSign logo on an e-commerce site, you know, the badge that says the site is verified and secure, and we replaced that.

[00:05:50] I couldn't help feeling that my audience didn't know what that meant, and they were worried about security. So that was my hypothesis, if you like. And I then replaced that with just a padlock and some text, right? Little picture of a padlock, bit of text saying secure. And when I did the usability testing, they would say, oh, that's so much better, I understand now that it's secure.

[00:06:17] And they actually started to give me some feedback about wording. It'd make even more sense if it said blah, blah, blah, blah, blah. Great, you don't get that from A/B testing, do you see what I mean? So just bear that in mind, depending on the size of the test that you're working on.

[00:06:33] And then like I said, it's just launch and repeat, basically. You fix one problem, you go on to the next, and that is really forever. [LAUGH] You just carry on working through the list, constantly refining and constantly improving. And that is fundamentally it with that post-launch optimization phase.

[00:06:57] It's not complicated, but it is where you make the real wins. It's the small changes that make an enormous difference to improving the experience. And you need to get in the habit of doing that on an ongoing basis, cuz there's always things to improve. There's always things to fix, because you improve one thing and you end up breaking another.

[00:07:21] Or you improve one area and it just exposes problems that users didn't get to further down in the journey. So really, right from the beginning, start talking about the importance of this kind of work cuz it is absolutely fundamental to your success.