The Product Design Process

Testing Prototypes and Beta Builds

Paul Boag

Boagworld

The "Testing Prototypes and Beta Builds" lesson is part of the full The Product Design Process course featured in this preview video. Here's what you'd learn in this lesson:

Paul discusses different methods of testing in product design, specifically focusing on mockups and prototypes. He mentions using tools like RealEye, Attention Insights, Lyssna, Maze, and LookBack for testing purposes. He also provides tips for conducting facilitated usability testing and emphasizes the importance of having clear objectives and tasks, remaining neutral, and debriefing afterward.


Transcript from the "Testing Prototypes and Beta Builds" Lesson

[00:00:00]
>> Prototypes, on the other hand, are a big part of it. So testing both prototypes and the beta build, this is the time where you probably wanna do a little bit more in-depth testing, in terms of more qualitative testing, cuz we've done a lot of surveying so far and a lot of data-driven stuff.

[00:00:20]
But at some point, we wanna just sanity check it with real people from a more qualitative point of view. So this is where I tend to introduce unfacilitated testing. I'm a great fan of unfacilitated testing cuz it's quick, but you get real people looking at it. So instead of you sitting in every test session, having to talk people through it and all the rest of it, essentially all you're doing is giving them a task, a series of tasks to do, asking them to speak out loud while they do it, and they're recorded doing the process.

[00:00:52]
So again, tools like Lyssna or Maze do a great job at this, and it's pretty simple to set up, if I'm honest with you. It's just a matter of going in and setting each question, saying where the correct location is that they need to go to. And then they're walked through it.

[00:01:13]
So you can see an example one running on my own website here, where they're given the task and what the task is, etc. And then they just click and complete the task. And you get results back at the end of the day: what was working well, what wasn't working so well, etc.

[00:01:34]
So that is unfacilitated testing, and it works really well. But the downside of Lyssna in particular is that it gives you a load of statistics about how well someone completed it, but doesn't let you watch back the sessions, which strikes me as a really weird thing. So, I like to watch the sessions back.

[00:01:55]
So there's an alternative tool, which is called Lookback. And you can use Lookback both for unfacilitated and facilitated testing, and I highly recommend it. You can do facilitated testing where you moderate it and you take people through the process. I mean, you could do that in Zoom, but I prefer Lookback cuz it's got certain things that Zoom doesn't have.

[00:02:24]
So for example, you can add annotations while someone's talking and say, that was a good bit, let's highlight that, or let's make a note on that. Also, after the event it's easy to edit those down, take out highlights and lowlights, and kind of show them off to people.

[00:02:39]
And it does a complete transcript, which is useful because you can search through it and stuff. It's not a massively expensive tool to get for just a month when you wanna do a bit of testing. So it's worth having a go with these tools, trying them out. It can feel a bit intimidating, to be honest with you, doing this kind of testing, but my advice is, the first time you do it, just don't tell anyone you're doing it.

[00:03:07]
And just kinda do it under the radar. That way, if you make a complete hash of it and it's a nightmare, nobody ever needs to know. [LAUGH] So if you've not done it before, have a play with these things. I think testing can become such a formal thing.

[00:03:24]
Like it's a very grown-up, proper discipline. And we've kind of lost that attitude of, just go see what happens, what's the worst that can happen? And of course that's how I learned everything in my career, cuz unlike you guys, I didn't get proper training, cuz I predated all of that.

[00:03:41]
And I think we've lost that a little bit, that kind of spirit. Now it's all, I must do it properly, I must follow best practice and all of that. And with my generation, it was like, well, let's give it a go and see what happens, and then you learn from doing that. So give it a go, see what happens.

[00:03:58]
A few quick tips for facilitated usability testing, if you ever choose to do that. And again, I go into this in a lot more detail, I'm going through it very fast now, but I've got a whole separate course on this. Be very clear why you're testing. A lot of people do testing, and this doesn't just apply to facilitated usability testing but to testing generally, they do testing because that's what you're supposed to do, right?

[00:04:25]
Rather than testing to answer a specific question, or to achieve a specific objective. So this is why I do lots of little tests throughout my process, because something will pop into my head: I wonder whether people are gonna spot that thing I just mocked up. All right, let's do a bit of testing and find out. Or, which of these two options is more important to people?

[00:04:51]
Let's run a quick survey to find out. There's always a very clear objective in the testing that I'm doing, rather than, it's about that time, we better do some testing. Why are we doing this? Try and make your tasks as realistic as possible, rather than some abstract stuff.

[00:05:11]
I can't think of a good example now, but often tasks are a bit obscure, like why would anybody be doing that anyway? You shouldn't be focusing on testing those kinds of things. Preparing a script, I find, helps a lot, especially if you've not done facilitated usability testing before, and I talk through what should go in that script a lot more in the other course.

[00:05:35]
But you've gotta be flexible, right? People will do something really weird partway through, and don't just move on to the next question, dig down on that, right? Or they'll make a comment like, I don't know, yeah, I don't quite get what's going on here. Well, what do you mean? What is it here that you don't get? What's confusing you about that?

[00:05:59]
Ask lots of follow-up questions, so see your script as a starting point, not as a straitjacket. Work hard to relax people if you do facilitated testing. And actually, that's one of the drawbacks, although I love facilitated testing, because it means I get to ask a lot of follow-up questions, I get to know people a bit.

[00:06:18]
It provides a better kind of connection and empathy. Its drawback is that people feel awkward doing it. They're worried about offending, they're worried about being watched, they feel like they're going to do it wrong. So you've got to work really hard to relax people. And whether you're doing facilitated or unfacilitated, get people to think out loud, right?

[00:06:44]
You really have to hammer that home with unfacilitated, because you don't get to follow up. You've got to say it very upfront: we need you to speak out loud. We need you to say what you're looking at, and what you're thinking, and what you're intending to do next, and all that kind of stuff.

[00:07:00]
But at least with facilitated, if they go quiet, you can say, so what are you thinking? What's going through your head? What are you looking at? That kind of thing. Try and remain neutral, and this applies to all testing. When wording questions and tasks, it's so easy to introduce bias or give stuff away.

[00:07:21]
If somebody's got to go into, I don't know, the DIY category of an e-commerce site, right? I don't know why that was the first thing to pop into my head. I hate DIY, by the way. But yeah, if you've gotta go into the DIY category, you don't word the question along the lines of, go and buy a DIY product, cuz that's just too leading.

[00:07:49]
You say something like, you're planning to build some shelves in your bedroom, and you need some tools to help you deal with that, where would you look? You see the difference? One's much more of a scenario, a story, that avoids those leading biases. Oftentimes, again, ChatGPT is pretty good for this, right?

[00:08:16]
I often will take my questions or tasks, drop them into ChatGPT and say, help me ensure this is neutral. And I'll say, I'm using this as a question in a survey, or I'm using it in usability testing, ensure this question is as neutral as possible. If they've got any questions, I tend to answer those at the end of sessions rather than during the session, because I don't wanna feed them additional information. So I apologize for not answering the questions that they have, cuz it's, am I doing this right?

[00:08:51]
Does this mean this? And that kind of thing. You don't wanna be answering those, but you do wanna pick them up at the end. And make a point of showing your gratitude afterwards, maybe even giving them something, an Amazon voucher or stuff like that really helps if you're gonna do that kind of in-person stuff. And debrief quickly afterwards; it's amazing how quickly you'll forget everything that you've just learned.

[00:09:15]
I'm very aware that I've just thrown a lot at you, and this is what we go into much more in the user research and testing course, which is all about being budget-friendly and dealing with a lack of resources. That's the whole thing with that. If you're really short on time, drop facilitated testing.

[00:09:34]
It takes forever to do, right? So that one goes out the window straight away. I would focus on unfacilitated usability testing, it can work really well, but you don't need to test that many people. Three. Do you know that if you test with just three people, you will find about 75% of the possible issues? Which ain't bad, is it?

[00:09:59]
If you go to six, that turns into about 90%, and anything more than that, you're wasting your money, right? So with usability testing, just test with three people, unfacilitated remote testing. With some of the others, you don't need to use all these techniques. Just pick whichever one seems most appropriate in a situation.
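[Editor's note: the three-testers and six-testers percentages quoted here are in the same ballpark as the classic problem-discovery curve popularized by Jakob Nielsen, where each tester is assumed to independently uncover a given issue with some probability, often quoted around 31%. A minimal sketch of that model, with the caveat that the exact figures in the talk are approximate rather than derived from one fixed probability:]

```python
def discovery_rate(n_testers, lam=0.31):
    """Expected share of usability issues found by n testers under the
    Nielsen-style model 1 - (1 - lam)^n, where lam is the probability
    that any single tester uncovers a given issue. The 0.31 default is
    Nielsen's commonly cited estimate; the talk's ~75% (3 testers) and
    ~90% (6 testers) figures are rough, in-the-same-ballpark numbers."""
    return 1 - (1 - lam) ** n_testers

# Diminishing returns: each extra tester adds less than the one before.
for n in (1, 3, 6, 12):
    print(f"{n} testers: ~{discovery_rate(n):.0%} of issues found")
```

The practical takeaway matches the advice in the talk: the curve flattens quickly, so a handful of testers catches most issues and large samples mostly re-find the same problems.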

[00:10:19]
So if you have stakeholders moaning about whether people will spot something, do a five-second test to see if they remember the thing. If they think that the look and feel is terrible, then do a semantic differential survey, right? So it depends on the issues you're coming up against as to which testing methodology to use.

[00:10:45]
And that's what we kinda get into on the other course. But hopefully that helps a little bit. So usability testing is great as just a general sanity check, but the rest of them are specific to circumstances, okay? Which to pick [LAUGH] is actually the next slide, really, so I'm almost kinda getting into the same thing.

[00:11:07]
I mean, when it comes to unfacilitated testing, if you're gonna consider that, it's easier to run. You can do it with more people if you want to, because you don't have to sit through every session. You tend to get results faster, because you don't have to arrange for people to come in and all the rest of it, syncing calendars and all that kind of stuff.

[00:11:29]
So I find it better for that, is it working, kind of testing. Does this work, does it not? Do people understand it? Do they not? Well, facilitated allows more guided interactions. So this is particularly good if your prototype is a bit rubbish, [LAUGH] right? Bits of it will break or don't work very well, and people are likely to get lost or confused because you know you haven't covered everything yet. Then facilitated testing works better, because you can guide people.

[00:12:02]
It's more adaptable. You can ask follow-up questions, go off on tangents. You'll learn a lot more about your users through doing this. It's easier to get clarification on things people say, and it's easier to observe non-verbal cues as well with facilitated. You don't spot them in the same way watching a video, I don't really know why, but you don't.

[00:12:24]
So I tend to think that I prefer facilitated when I wanna know why something's not working, right? So yeah, the time I use facilitated the most is when I do post-launch optimization, say, I know people are failing to complete a particular task on the website. So I know it's broken already, right?

[00:12:49]
Because I've got data that shows me that, but I have no clue why. Then I'll do some facilitated testing.
