The Product Design Process

Testing Live Apps

Paul Boag

Boagworld

The "Testing Live Apps" Lesson is part of the full, The Product Design Process course featured in this preview video. Here's what you'd learn in this lesson:

Paul discusses the importance of testing live apps and recommends using Microsoft Clarity for monitoring user behavior. He explains how to analyze analytics such as exit points, misclicks, rage clicks, excessive scrolling, and quick backs to identify problem areas. Paul also mentions the use of heat maps and session recordings to gain further insights, and suggests A/B testing for small fixes and prototyping for larger solutions.

Transcript from the "Testing Live Apps" Lesson

[00:00:00]
>> Which brings me on to testing live apps. Once your app is up and running, really, now's the moment, people. This is the moment where testing is so, so important, because you'll never get your app right the first time, right? Setting aside MVPs and all the rest of it, you'll make a mess of it, you'll do something wrong, there'll be things that can be improved without a doubt.

[00:00:27]
So this is the point where you've really got to kick in and start really paying attention to what's going on. That's so much easier if you've got the right tools for the job. I highly recommend Microsoft Clarity, and installing that. You've probably never even heard of it, but it's a free version of Hotjar, right?
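
For context, installing Clarity just means adding the small tag it generates to every page. Here's a rough TypeScript sketch of what that tag does; this is a from-memory approximation, so copy the exact snippet and real project ID from your own Clarity dashboard rather than this version:

```typescript
// Rough sketch of what the Clarity tag does: queue any early clarity() calls,
// then load the tracker script. The project ID below is a placeholder.
const CLARITY_PROJECT_ID = "your-project-id";

const w = window as any;
w.clarity =
  w.clarity ||
  function (...args: unknown[]) {
    (w.clarity.q = w.clarity.q || []).push(args);
  };

const tag = document.createElement("script");
tag.async = true;
tag.src = `https://www.clarity.ms/tag/${CLARITY_PROJECT_ID}`;
document.head.appendChild(tag);
```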

[00:00:50]
Very low impact on performance, which I like, and it gives me everything I need to monitor users on a live app, right? And by the way, it works with native apps as well, which obviously is important. So what do I do? Okay, so once I've got Microsoft Clarity up and running, what I'm basically doing straightaway is diving into Clarity and looking for certain things, right?

[00:01:16]
I'm paying attention to certain analytics. Exit points: where are people abandoning the app, okay? That's actually almost more useful when you're talking about an informational website, cuz on apps it's a bit more complicated. But you will be able to see that maybe people drop out from certain places where they're getting frustrated.

[00:01:38]
But where you get the really good metrics is things like misclicks, where people are clicking on something that's not clickable or isn't currently active. So oftentimes it's a button that's disabled, or they think they can click through to get more information on a link or on a piece of text that's not really clickable, things like that. You'll suddenly see all of those, right?

[00:02:00]
Rage clicks is another one. [LAUGH] Where people just lose their shit and start going, no, no, no, no. You've got a big issue when you start seeing a lot of those on a particular screen. And then there's excessive scrolling, where people just shoot up and down a page. You reach a certain point when you've looked for what you wanted to see.

[00:02:23]
You're not seeing it, so then you start going like that, right? Everybody does it, it's a weird thing that we do. So it's a good indication that you've got something wrong at that point. And then the final one is quick backs. So they go to a screen and then go immediately back, which means that obviously, they didn't find what they were looking for, or it didn't look like what they expected it to, or there's an issue.
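
As a rough illustration of what counts as a rage click, here's a minimal client-side sketch. The thresholds and the console logging are assumptions made up for the example, not how Clarity actually detects them:

```typescript
// Minimal sketch of a rage-click heuristic: several clicks on the same
// element within a short window. Threshold values are arbitrary assumptions.
const CLICK_WINDOW_MS = 1000;
const RAGE_CLICK_COUNT = 4;

const recentClicks = new Map<EventTarget, number[]>();

document.addEventListener("click", (event) => {
  const target = event.target;
  if (!target) return;

  const now = Date.now();
  const times = (recentClicks.get(target) ?? []).filter(
    (t) => now - t < CLICK_WINDOW_MS
  );
  times.push(now);
  recentClicks.set(target, times);

  if (times.length >= RAGE_CLICK_COUNT) {
    // In practice you'd send this to your analytics tool rather than log it.
    console.warn("Possible rage click on", target);
    recentClicks.delete(target);
  }
});
```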

[00:02:47]
So I start with that. And I look at those problem areas, and you can get them as heat maps, so you can look at those pages. Sometimes it is obvious, right? Yeah, they tried to click on something that's not clickable that really should have been clickable, so we'll make it clickable, job done, right? Yeah.

[00:03:07]
>> Will this catch unhandled exception errors, or does it just-
>> Yeah, I think it will actually. I didn't mention that cuz I'm not focusing so much on kind of debugging and things like that, but we should pick up that kind of thing as well, error messaging. I haven't used it, so I can't 100% promise that, but I think it does.
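
If you do want to be certain unhandled exceptions are captured, one option is to report them yourself alongside whatever analytics you run. A minimal sketch, where the /api/client-errors endpoint is a made-up placeholder, not a real service:

```typescript
// Minimal sketch: forward unhandled errors and promise rejections to your own
// backend. The endpoint path is a placeholder for illustration only.
function reportClientError(kind: string, detail: string): void {
  navigator.sendBeacon(
    "/api/client-errors",
    JSON.stringify({ kind, detail, url: location.href })
  );
}

window.addEventListener("error", (event) => {
  reportClientError("error", event.message);
});

window.addEventListener("unhandledrejection", (event) => {
  reportClientError("unhandledrejection", String(event.reason));
});
```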

[00:03:30]
So yeah, sometimes you look at it and go, all right, people are having problems at this point, but I'm not entirely sure what's going on here. So you can get additional clarification. Heat maps work really well, so you can now dig in and go, okay, let's have a look at the scrolling on that page, let's look at what people are clicking on.

[00:03:50]
The heat maps sometimes help, but the one that really helps is session recordings. So I watch back session recordings. Let's say a particular screen has got loads of rage clicks going on it. What I will do is I'll filter my sessions by all of those that go to that screen and do rage clicks.

[00:04:12]
And then I'll sit and watch those videos back and see what people are doing, right, and where it's going wrong. Now, 90% of the time, you'll then go, ah, they're doing that. [LAUGH] Yeah, we messed up. And you're like, okay, I know how to fix this, right?

[00:04:30]
The other 10% of the time, you go, I still have no clue what's happening there. So that's when I would run some facilitated usability testing, right? Because then I can actually talk to that person and work out what's happening there, cuz obviously you can't talk to people over session recordings.

[00:04:47]
But most of the time you'll have an idea of what's gone wrong. So then we've got to fix what's gone wrong, right? And there are two ways of testing those fixes, depending on the size of the fix. One is A/B testing. A/B testing used to be so easy: there used to be Google Optimize, and you could just shove it on your site and you could do A/B testing.

[00:05:13]
These days they've closed down Google Optimize, which is such a shame because it was free. And now there are lots of expensive things like Convert, which cost like 300-plus dollars a month to run. But A/B testing is great, because basically it gives you a WYSIWYG editor.

[00:05:37]
So if it's just little changes, like I need to change the color of the button, or I need to move this image, or I need a bit of explanation text added, you can do all of that through a WYSIWYG editor and just run it with a percentage of users to see whether it improves things.

[00:05:53]
And if it did, then you can roll it out to everybody. There are alternatives: Crazy Egg is a much cheaper platform, not quite as powerful, but it does do A/B testing. So if you wanna give it a go, by all means, that's one way to do it. We get into A/B testing a bit more in the other course.
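
If you'd rather see the mechanics than buy a tool, the core of a simple A/B test is just a sticky random assignment plus the change itself. Here's a minimal hand-rolled sketch, where the storage key, the selector, and the 50/50 split are all assumptions for illustration:

```typescript
// Minimal sketch of a sticky A/B assignment: pick a variant once per browser,
// persist it, and apply the small change for the variant group only.
type Variant = "control" | "variant";

function getVariant(): Variant {
  const stored = localStorage.getItem("ab-button-color");
  if (stored === "control" || stored === "variant") return stored;
  const assigned: Variant = Math.random() < 0.5 ? "control" : "variant";
  localStorage.setItem("ab-button-color", assigned);
  return assigned;
}

if (getVariant() === "variant") {
  // The small change under test, e.g. a different button color.
  document
    .querySelector<HTMLElement>("#buy-button")
    ?.style.setProperty("background-color", "#0a7");
}
// Record the assigned variant alongside your conversion events so the two
// groups can be compared before rolling the winner out to everybody.
```

A tool like Convert or Crazy Egg then layers the WYSIWYG editing and the results analysis on top of this basic idea.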

[00:06:14]
The other option is testing bigger solutions. So what do I mean by that? Well, sometimes it's not a matter of just changing a bit of text or changing a color on a button. Sometimes it's, yeah, we're gonna have to change the whole way that filters work here.

[00:06:32]
Or this really needs a wizard built to do it for you. And that kind of thing you can't easily do with A/B testing, cuz with A/B testing you have to build it. And if you don't know whether your solution is gonna be a good one, you don't wanna spend all that money building the thing for it to then not work, right?

[00:06:51]
So in those situations, it's back to prototyping again. We wireframe it up, we do some facilitated usability testing, see whether it seems to work for people. And if it does, then we build it and then we roll it out. So that's kind of how I handle testing. I went through that really fast because I wanted ample time for the last part, but there is an entire course on that if you so wish.

[00:07:19]
Just to kind of wrap that up, testing is absolutely essential to product design. If you're working in a company and they're claiming that you're doing product design and you don't get to test, then you're not doing product design, right? I don't know what you're doing, but you're not doing that.

[00:07:38]
Testing should be done, in my opinion, often and early, right? Lots of little rounds of testing, testing very, very specific things that you're worried or concerned about, not just doing it for the sake of it. And I would argue it can be carried out throughout the development cycle, from initial wireframing to post-launch iteration.

[00:07:59]
There's never a time when you won't be unsure about something, you won't disagree about something, you won't need clarification over something. So as long as those things are happening, there's an opportunity to run a little test. Are there any questions about testing, cuz I have gone through that so ridiculously fast?

[00:08:21]
And I recognize not everybody has seen the other course, so I'm happy to deal with any questions here, yeah,
>> Just a quick comment. With my last client, doing mobile was a little bit harder in terms of deployment cycles.
>> Yeah.
>> Because we have to go through the stores.

[00:08:40]
So if Google or Apple are gonna say, hey, you did something wrong, then that brings us back to, we've got to go fix it. So the way we did our testing was, whenever there was a new feature, the UX/UI team would come to us and say, here's the flow, this is what we're planning for.

[00:08:54]
And then that's when we get all our input, and we break it down into small stories to incrementally get to that final feature. But before a story got into a PR review,
>> Yeah.
>> The developer that worked on that story would showcase it.
>> Yeah.
>> So we would do what we call an amigo review.

[00:09:12]
>> Yeah.
>> And then we would get stakeholders, the UI/UX teams, all going, you know, we didn't add enough padding here or something,
>> Yeah.
>> So every time we did a story, we did very early amigo testing.
>> Yeah.
>> Got feedback from everybody, and that helped a lot because by the time we finally did push to the store everyone had their input.

[00:09:31]
So it came down to: a story, developers showed their work, got the feedback, then it went through PR review, then through QA, and then through the product owner's approval.
>> Where's the user in that? And that's the shortcut. Everything else sounded great, right? You were engaging with stakeholders, you had a good working relationship between UX and development, but there is no interaction with the user in that process, right?

[00:10:01]
And that's the fundamental problem there. And I understand that you have this barrier of the store. But that doesn't mean it isn't happening. Perhaps the UX team are doing it and you're just not aware, cuz you've been working on the development side. But when the UX team is creating that prototype, that prototype needs to be put in front of users to make sure that it's working.

[00:10:24]
>> I have to step back, I'm sorry, we did have users.
>> Right.
>> So we-
>> Cuz it's such a thorough process. So I was like, how can there not be users involved in that?
>> Before we were finally released to the general audience?
>> Yeah.
>> There are tools called the Google Play Store and TestFlight.

[00:10:41]
>> Yeah.
>> That's where you could say, hey, this is really going into production, but to a limited number of users.
>> And that's fine, and has its value, but you've fallen into that trap of all the testing being at the end when it's most expensive and most difficult to change it.

[00:11:00]
I would say that your UX team at the beginning, when they mock it up and prototype it, they should pass it by some users. It's not gonna be a final app or anything like that. But in Figma, for crying out loud, there's an associated mobile app which allows you to display prototypes and test with them.

[00:11:22]
Lookback, I think, works on mobile as well, so you can record the screen and do that kind of thing. Or you could get people in; there's lots of ways of doing it. Even if all you did was a little bit of testing on some mock-up screens that were related to it, that's better than nothing.

[00:11:37]
So I would try and push some testing towards the beginning of the process, so that there's more flexibility to change. Cuz let's be honest, once it's gone through QA, once all the stakeholders have approved it, once the developer has built it, there's no way. Even if it's then sent out to TestFlight and everybody hates it, it's gone through so many rounds of getting internal approval that you're not gonna change it at that late date, it's too late.

[00:12:07]
So that's why testing earlier is always better, in my opinion. I keep adding "in my opinion".
>> I guess you're right, we really didn't incorporate the users at the early phases.
>> And that's the trouble, that's a big problem. Even I need to learn this, actually, because I've fallen into this several times.

[00:12:26]
I talk about testing, and in my head, I'm always talking about user testing. But of course, if you're a developer, you're not talking about that, you're talking about quality and bug testing and that kind of stuff. So different people use the same language to mean different things, and actually that's really dangerous.

[00:12:43]
So, yeah, user testing, testing with users, is the bit that's often missing, cuz people will go, yeah, we do testing, but they're thinking of QA testing, right? The reason that I rushed through that section is cuz I wanted to spend a little bit more time on our next session, which is building design systems.
