UX Research & User Testing

Identifying Points of Friction

Paul Boag

Boagworld

The "Identifying Points of Friction" Lesson is part of the full, UX Research & User Testing course featured in this preview video. Here's what you'd learn in this lesson:

Paul walks through the process of identifying and resolving issues on problem pages to improve the user experience. He explains how to use analytics to identify problem pages through exit rates, bounce rates, rage clicks, dead clicks, excessive scrolling, and quickbacks. He also explains how to narrow down the exact issue by analyzing heat maps and session recordings, and how to use A/B testing to test potential solutions on problem pages.

Transcript from the "Identifying Points of Friction" Lesson

[00:00:00]
>> Stage one, identify points of friction, right? So where are things going wrong? Before you can improve the experience, you need to understand which pages are underperforming, right? Where are things going badly? And actually, there are certain warning signs in your analytics that are really easy to look up.

[00:00:19]
So identifying problem pages is so easy. First of all, you're looking for exit pages, where people are abandoning the website. Easy, look that up. No problem. Even I can do that in Google Analytics, and I hate Google Analytics, especially GA4. We all hate GA4. And then the second one is bounce rates, where people are bouncing from a page.

[00:00:47]
Those are the big boys. So if you've got pages with a high bounce rate and a high exit rate, chances are something's going wrong on those pages, right? There are other signs you can look for if you've got something like Microsoft Clarity running on the site, which by the way is free and easy to set up and has a very low performance hit.
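As an illustrative aside (not part of the lesson's tooling): here's a minimal TypeScript sketch of ranking problem pages by exit rate and bounce rate, assuming you've exported per-session page paths from your analytics tool. The field names are hypothetical.

```typescript
// Illustrative only: assumes per-session page sequences exported from
// your analytics tool; the field names here are made up.
interface SessionRecord {
  pages: string[]; // page paths viewed, in order
}

function rankProblemPages(sessions: SessionRecord[]) {
  const stats = new Map<string, { views: number; exits: number; bounces: number }>();

  for (const { pages } of sessions) {
    pages.forEach((page, i) => {
      const s = stats.get(page) ?? { views: 0, exits: 0, bounces: 0 };
      s.views += 1;
      if (i === pages.length - 1) s.exits += 1; // last page of the session
      if (pages.length === 1) s.bounces += 1;   // single-page session
      stats.set(page, s);
    });
  }

  // Worst offenders first: high exit rate plus high bounce rate.
  return [...stats.entries()]
    .map(([page, s]) => ({
      page,
      exitRate: s.exits / s.views,
      bounceRate: s.bounces / s.views,
    }))
    .sort((a, b) => (b.exitRate + b.bounceRate) - (a.exitRate + a.bounceRate));
}
```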

[00:01:09]
I've been really impressed, because obviously the more stuff like this you add to a site, the more it slows things down, but actually it runs well. So other signs that Clarity provides that I often look at are rage clicks, right? Where people go, stupid thing, why is it not working?

[00:01:26]
And they're clicking away like that. Dead clicks, where people are trying to click on something that isn't clickable. That's always a good sign something's wrong. Excessive scrolling, where they're scrolling up and down like that, right? That's a sign that they're not happy with the page, they're not looking at it properly. And then quick backs, which are very similar to bounce rates: basically, I've gone to a page and then immediately gone back.
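To make one of those signals concrete, here's a rough sketch of how a rage click might be detected from raw click events. This is illustrative only; the thresholds (three clicks within one second, within 30 pixels of each other) are invented and not how Clarity actually defines it.

```typescript
// Hypothetical rage-click detector over a session's click events.
interface Click {
  x: number;
  y: number;
  time: number; // ms since page load
}

function hasRageClick(clicks: Click[]): boolean {
  for (let i = 0; i + 2 < clicks.length; i++) {
    const [a, b, c] = [clicks[i], clicks[i + 1], clicks[i + 2]];
    const fast = c.time - a.time <= 1000; // three clicks in under a second
    const close =
      Math.hypot(b.x - a.x, b.y - a.y) <= 30 &&
      Math.hypot(c.x - b.x, c.y - b.y) <= 30; // roughly the same spot
    if (fast && close) return true;
  }
  return false;
}
```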

[00:01:47]
So those are all signs that that page is a problem, right? So you'll end up with a list of problem pages: the ones with the highest exit rate, the ones with the highest bounce rate, and so on. You pick the worst-performing one, and then we have to narrow down the exact issue.

[00:02:06]
So what is wrong on that page, okay? There's a really simple process to go through. So, step one, you look at the heat maps. So whether it's in Clarity, as we've got here, or whether it's Hotjar, I don't care, but you look at your heat maps to begin with, right?

[00:02:27]
Are people scrolling properly? Where are they clicking on the page? All those kinds of things you can see on a heat map, and you can look at that and often you can go, well, only 3% of people are clicking on the button on this page that we want people to click on.

[00:02:44]
Well, okay, why is that? Maybe not many people are scrolling that far down the page, and you can just look at it and kind of get an idea of what's going on. Sometimes you sit there and go, I can't see the problem here. So what I'll then do is switch to session recordings.

[00:03:05]
So I'll switch across to the session recordings for people that have gone through that page and have a look at what they're doing. Are they doing anything weird? Does anything feel wrong there? If I really get desperate, well, normally by that stage you've kinda got an idea of what it might be, right?

[00:03:22]
The heat maps show you the scrolling, the clicks and all the rest of it, so you can normally work it out. If you really get stuck, you could run usability testing on that page, so you can actually ask people why they're finding it confusing. Or you could run a survey on that page: click here if you're getting frustrated and tell us why. There are ways and means.

[00:03:45]
So once you've found the problem, then you're gonna come up with some solutions for how you think you can fix it. And it's beyond the scope of this to tell you how to work that out. But you come up with ideas of what you think might work on the page.

[00:04:06]
So they basically fall into two buckets, right? There are smaller changes, right? So sometimes you look at it and go, basically, if I just change this image, or if I change this label, or if I change the color of the button or something like that, then that will probably fix the problem, right?

[00:04:25]
So, those kinds of small changes, we can test whether or not they're gonna work with A/B testing, right? A/B testing sounds really complicated. It really isn't. Basically, all you're doing is taking a percentage of your incoming traffic and sending it to variations of that page where those changes have been made.

[00:04:53]
So the majority of traffic will still go to the original version, but you'll send 10% off to one variation, and if you've come up with various ways of solving it, then 10% goes to another and 10% to another and so on, right? So you can test as many versions as you like at the same time if you so wish.
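As a sketch of the mechanics being described (not any particular testing tool's API), here's how a deterministic traffic split might look, using the 10% weights from the example above. The variant names and hashing scheme are hypothetical.

```typescript
// Sketch of a traffic split: hash the visitor ID into the 0-99 range and
// walk fixed weights, so the same visitor always sees the same variation.
const variants = [
  { name: "original", weight: 70 },
  { name: "variation-a", weight: 10 },
  { name: "variation-b", weight: 10 },
  { name: "variation-c", weight: 10 },
];

function assignVariant(visitorId: string): string {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple deterministic hash
  }
  let bucket = hash % 100;
  for (const v of variants) {
    if (bucket < v.weight) return v.name;
    bucket -= v.weight;
  }
  return "original"; // fallback; weights sum to 100, so rarely reached
}
```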

[00:05:12]
Where I primarily use A/B testing is when I'm doing things like changing text, a really small textual change. Great, no problem, you can do that easily with A/B testing. When I'm changing buttons, it works really well for that, and when I'm changing images, right? All of that can be done very easily with A/B testing, right?

[00:05:41]
There are two types of A/B testing: there's A/B testing, and then there's multivariate testing. And this language gets confusing. But basically, A/B testing is where you're changing one thing on the page, so maybe just the text. Multivariate is where you're changing multiple things on the page, so maybe the text, the button, and the image, right?
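And a small hypothetical sketch of the multivariate case: each element gets its own set of options, and a visitor is assigned one combination of them. The element names and options below are invented for illustration; note how quickly the number of combinations grows, which is why multivariate tests tend to need more traffic than a simple A/B test.

```typescript
// Hypothetical multivariate setup: three elements, two options each.
const elements = {
  headline: ["Start your free trial", "Try it free for 30 days"],
  buttonColor: ["green", "orange"],
  heroImage: ["team-photo.jpg", "product-shot.jpg"],
};

// 2 x 2 x 2 = 8 combinations to split traffic across.
function pickCombination(random = Math.random) {
  return Object.fromEntries(
    Object.entries(elements).map(([key, options]) => [
      key,
      options[Math.floor(random() * options.length)],
    ])
  );
}
```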

[00:06:04]
Whatever the case, you can easily do that kind of thing, no problem; making those kinds of lightweight changes on a single page is easy. Now, it used to be that you could use Google Optimize to do this, and it was great and it was free and the world was a wonderful place and it was all great and wonderful.

[00:06:27]
They closed it down for reasons that escape me, but, you know, fair enough.
