UX Research & User Testing

Getting Access to Customers

Paul Boag

Boagworld

Check out a free preview of the full UX Research & User Testing course

The "Getting Access to Customers" Lesson is part of the full, UX Research & User Testing course featured in this preview video. Here's what you'd learn in this lesson:

Paul addresses common objections and resistance to change when it comes to user research and testing. He discusses strategies for overcoming objections such as lack of access to the audience, limited sample size, biased results, and concerns about disruption and time constraints. He also provides practical tips and approaches to address these objections and emphasizes the importance of being resourceful and adaptable in conducting user research.


Transcript from the "Getting Access to Customers" Lesson

[00:00:00]
>> I know that at some stage, probably a big majority of the people watching this have had a go at doing this and it's gone horribly wrong, right? They've met resistance, okay? They've come up against problems. People do tend to be resistant to change. And so if you are suggesting changing things, people will come up with all kinds of reasons why it can't be done.

[00:00:27]
So I thought I'd knock off a few of the most common ones and how I tend to respond to them. It may or may not work in your organization, I can't make promises, it depends on the people, it depends on the circumstances, but I'll give you a sense of how I deal with it.

[00:00:42]
So one of the big ones, have you ever had this one? We can't get access to our audience, have you ever had that one? Or we'd love to, we'd love to do user research and testing, but it's really hard to reach our customers, our clients are very busy, or whatever else, right?

[00:01:00]
That's quite a common scenario that comes up. So first thing I would say is you can do user research just using existing assets a lot of the time, right? So there will be data, there will be anecdotes, there will be online comments. There'll be feedback that the audience has given over time.

[00:01:23]
So, often one of the excuses is, not that we can't get access to the audience, but we talk to them so much we can't ask them yet again about another thing. All right, well give me everything you've ever talked about with them before, and I'll have a look through it and see what we can learn, right?

[00:01:42]
So that's always my starting point with this one. There may be gaps, there may be information that you can't get out of that, right? It doesn't answer your questions, so the next thing I tend to do is use surrogates, right? By that I mean people who are engaged with your target audience.

[00:02:01]
So the best one, and I'm doing this right now: I'm working with a charity to increase their fundraising. They have fundraisers, and a lot of people don't get around to it, they say, I want to fundraise, and they never actually do it. And so we're trying to work out why that is and how we can improve it.

[00:02:22]
But I'm meeting resistance in terms of speaking to those fundraisers, because they're sending them so many texts, so many messages, they don't wanna interrupt them again. So what I've done is sat down with the support staff, the people that are answering their queries and talking to them in the chat rooms, people like that, and said, well, what are the issues, right?

[00:02:46]
Well, one of the issues turns out to be that you're sending them too many texts and too many emails and too many messages and you're annoying them. So you can talk to people who've got a close working relationship with the end users, cuz a lot of the time organizational decisions are being made by people that never spend any time with users, right?

[00:03:05]
So bringing the voices of those who do into the equation can work very well. Working with colleagues is what I've just described, that idea of working with people who are in a close working relationship with users. The surrogates one is actually more about people who are very similar to your target audience.

[00:03:25]
So let me give you an example with that one. Let's imagine you wanna do some usability testing, right? Now that doesn't mean in that situation you need your exact target audience. You probably don't, right? The truth is that, I don't know, you and me, for example, okay? We're different generations, we come from different countries, we're different genders, we're very different people, yet doing usability testing, we probably would have very similar results.

[00:04:01]
Because usability is just about whether you can use it, right? We've got a similar level of computer knowledge, for example, and we're confident. Maybe my vision is causing me more of a problem than it is you, so that might be a slight difference that would affect usability testing. But generally speaking, you could swap people out quite easily with usability testing.

[00:04:24]
While if it was design taste or tone of voice, then we would be very different over that, cuz that's a more subjective thing and is more based on demographics and that kind of stuff. So your three options are: look at existing data; use surrogates, where appropriate, where you can get away with using someone who isn't your specific target audience; and work with the colleagues that are actually interacting with people on a daily basis.

[00:04:54]
So that's one big objection that people have. Another big objection is, well, you haven't spoken to enough people, or we don't have access to enough people. And so you'll maybe do a little bit of testing and then people go, well, you only tested with three people, right? Acknowledge that speaking to more people would be better, and that you would happily do that if you were given the time and budget to do it.

[00:05:22]
However, emphasize that even speaking to a small number of people is better than making guesses and assumptions, cuz that's the alternative, right? So really there's a false premise here, that we haven't spoken to enough people. One person is better than making stuff up yourself. But this is actually a really useful one for demonstrating that, well, yeah, I'd love to test with more people, let's do it.

[00:05:58]
It's a good one to turn around. Another common objection that you'll meet is that your results might be biased, all right? No, no, yes, I know you did a little bit of testing, but the results are gonna be biased. And so they dismiss it because they don't wanna face it.

[00:06:15]
So first of all, you can minimize the bias by being very careful about the questions that you ask in a survey or in your user testing. And in fact, I often find AI is really good for this, right? These days, if you take your question and you copy and paste it into ChatGPT, or your large language model of choice.

[00:06:41]
Other services are available. If you take your question, pop it into ChatGPT and say, make this as clear, concise, and unbiased as possible, it helps you remove bias very quickly and easily, right? So that's a great way of getting around that. But also, even if your results are biased, that's kind of okay.

[00:07:04]
It's okay as long as you're aware of those biases. And you've got to be, with any usability testing, especially if you're doing lean, quick-and-dirty usability testing: anything you do is gonna be a bit rubbish, right? It's gonna have its limitations. And so as long as you're aware of those limitations, that's fine, okay?

[00:07:28]
And then if people believe the results are biased by all means, let's do some more in-depth research. Again, you're upselling, the idea of let's do more and let's do it better. So a lot of these kinds of objections you get are actually opportunities to kinda push for something a bit more and a bit better.

[00:07:51]
But this is the biggie, this is the one you'll get the most. It'll be too disruptive and take too long, right? I get this all the time: we don't have time to do it, or we don't have the budget to do it, we've got to get this done by the end of the week, etc.

[00:08:10]
Any excuse just to keep things moving forward. Project managers especially are really bad at this, because they just wanna get it delivered, and I can understand that, they're under enormous pressure. So, how do I get round this one? Well, my best tip is, just do it.

[00:08:30]
Just don't tell anyone you're doing it, right? Don't give people the opportunity to say no. Just do it as part of your working practices, as the way you operate, right? So it was Grace Hopper, I think, who was the first female admiral in the US Navy, an amazing woman.

[00:08:52]
She came from a development, computer science background, an incredible woman, and she said a very famous quote which you'll all know: it's easier to ask for forgiveness than permission, right? That's what you need to do with usability testing. And when I show you the kind of usability testing I'm proposing, you can easily slip that under the radar without ever telling anybody you're doing it, right?

[00:09:19]
Don't ask, right? Avoid asking for anything. I know I just said "just do it," which sounds like don't ask, but what I'm talking about here is don't ask for anything. Don't ask for any money, don't ask for any extra time. Just do the research without any additional resources, all right?

[00:09:41]
Don't ask anybody else to change the way they work at least not to begin with. Because that's where you'll meet resistance. And so the extension of that is no extra time. Keep your research from impacting the timeline of the project, build around the timeline that already exists to begin with.

[00:09:59]
Now, I'm very aware that I'm making a lot of assumptions, and you're going, well, how can I do that, right? But that's what we're gonna come on to. This is why these are the kind of foundational principles. I'm hoping I can show you opportunities to start doing some of these things within these constraints.
