A quick and dirty guide to usability testing - for your coworkers

A few years ago, I wrote a usability testing guide on my company's internal blog to serve as a reference for coworkers interested in the process. Having something like this will not only elevate your company's understanding of the process, but also serve as a way to train your coworkers and grow your team's testing capabilities. I've copied it below:

What is usability testing, anyway?

Usability testing is where you observe people trying to use a mockup, wireframe, prototype, or functioning demo in order to determine whether it is easy for them to use and to identify ways to make it easier. The type of usability testing that we usually do at SofterWare is qualitative, where the purpose is to gain insights that help us improve what we are in the process of building.

What usability testing is not: learning design preferences (whether they "like" a particular color, graphic, font, etc).

The point is, it's not productive to ask questions like "Do most people like pull-down menus?" The right kind of question to ask is "Does this pull-down, with these items and this wording in this context on this page create a good experience for most people who are likely to use this site?"

And there's really only one way to answer this kind of question: testing. You have to use the collective skill, experience, creativity, and common sense of the team to build some version of the thing (even a crude version), then watch some people carefully as they try to figure out what it is and how to use it.

While debating about what people like wastes time and drains the team's energy, usability testing tends to defuse most arguments and break impasses by moving the discussion away from the realm of what's right or wrong and what people like or dislike and into the realm of what works or doesn't work. And by opening our eyes to just how varied users' motivations, perceptions, and responses are, testing makes it hard to keep thinking that all users are like us.

-- Steve Krug, "Don't Make Me Think"

Usability testing is just one of the many ways we can gain user feedback. Here's a brief comparison to help illustrate the differences between some of the more popular research methods:

  • Usability Testing: Observing behavior during tasks in order to detect and fix what is confusing or frustrating. This helps you learn whether your site/app works (or doesn't) and how to improve it. Usability testing should be used throughout the entire development process. Measures interaction: what people do. 🖥
  • Focus Groups: A small group of people comes together to express their opinions, talk about their experiences, or react to new concepts. Great for figuring out what people say they want, need, and like (in the abstract), for getting feedback on new ideas and on how they feel about us and our competitors, and for learning how they currently solve the problems our software is trying to help them with. Typically used in the planning stages, before you begin designing or building anything. Measures abstract thoughts and attitudes: what people say. 💬
  • Benchmark Testing: Similar to usability testing, but focused on collecting metrics like time on task or success rates. It must be more strictly controlled and recorded, and it gathers quantitative data rather than qualitative. Great for answering questions like "How long does it take for a user to run a report?", and it needs a large sample size to be significant. You can use it to track usability over long periods of time, or to compare against a competitor's product. Measures metrics: numbers. 📊 A rough sketch of these metrics follows this list.
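Since benchmark testing lives and dies by its numbers, here's a minimal sketch of the kind of metrics it produces: success rate and mean time on task. The data is made up and the script is my own illustration, not something from the original guide:

```python
from statistics import mean, stdev

# Hypothetical results for the task "run a report":
# one (completed_successfully, seconds_to_complete) pair per participant.
sessions = [
    (True, 94), (True, 121), (False, 240), (True, 88),
    (True, 143), (True, 105), (False, 212), (True, 99),
]

success_rate = sum(1 for ok, _ in sessions if ok) / len(sessions)
times = [secs for ok, secs in sessions if ok]  # time on task, successful runs only

print(f"Success rate: {success_rate:.0%}")
print(f"Mean time on task: {mean(times):.0f}s (sd {stdev(times):.0f}s, n={len(times)})")
```

A real benchmark would need a much larger sample than this, plus confidence intervals, before you could draw any conclusions or compare against a competitor.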

Now that you have a basic understanding of what usability testing is, let's move on to creating the tests themselves.

How do you create a usability test?

First, come up with tasks for the users to do; then expand those tasks into scenarios for greater context.

Tasks depend on what you have available to test. For rough sketches, a task might simply be asking the user to look at the sketch and tell you what they think it is. The more fleshed out the sketch is, the more detailed your testing can be. Providing a test with some interactivity, such as a prototype, will give us a better idea of their actual behavior as opposed to what they say they would do.

📝Note: Keep in mind that the more detailed your presentation is, the more likely your feedback is going to focus on the details. This is not always a good thing. Often I find myself wishing for more feedback on larger concepts -- "Is this user flow useful at all? Should this feature be done a completely different way? Is this really solving a problem, or introducing new ones? Do we have a full understanding of what the real problem is?". However, the tendency is that when people see high-fidelity mockups that look "finished", they limit their feedback to smaller details such as fonts, colors, and button placement.

Here are some examples of tasks you can ask the user to do:

  • Logging in using a username and password
  • Retrieving a forgotten password and/or username
  • Changing security questions
  • Choosing a recurring donation amount
  • Creating a new form
  • Adding a new donor to their database
  • Adding a new code for their events

Once you have your tasks sorted out, you can build them into scenarios. Here, you turn your tasks into a story where the user is a character with a specific goal in mind and a particular motivation, plus a few details to help them along the way. Be careful when guiding the user: where possible, use words that explain what they need to do but that aren't contained within the page itself. You're testing the terminology of the page, too.

You have to phrase it so that it's clear, unambiguous, and easy to understand, and you have to do it without using uncommon or unique words that appear on the screen. If you do, you turn the task into a simple game of word-finding. For instance:

Bad: "Customize your LAUNCHcast station"
Better: "Choose the kind of music you want to listen to."

--Steve Krug, "Rocket Surgery Made Easy"

Example:

  • Task: Build a new form
  • Scenario: You work for a non-profit animal shelter, and you have a fundraising dinner coming up in a couple of months. You need to make a new online form that will register people for this event and collect their registration fee.

You can provide information that the tester might need in order to complete the test (such as a username and password, an event code, an import file, or a header image ready for them on the desktop), but the key to getting good data out of your test is not giving away any clues to the scenario.

Another thing to consider is the order in which you ask them to complete their tasks. Some tasks or questions can lead the user in a particular direction, which can influence the data you are trying to collect, or act as a spoiler for a task further along in your test.
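One common way to handle ordering effects, assuming the tasks are independent (no task spoils another), is to vary the task order from participant to participant. Here's a minimal sketch of that idea in Python; the task names are reused from the examples above, and the function is just an illustration, not part of our actual process:

```python
import random

# Hypothetical task list, reusing examples from earlier in this guide.
tasks = [
    "Log in using a username and password",
    "Retrieve a forgotten password",
    "Create a new form",
    "Add a new donor to the database",
]

def task_order(participant_id, seed=2024):
    """Return an independently shuffled task order for one participant.
    Seeding on the participant id keeps the order reproducible, so notes
    can be matched back to what each participant actually saw."""
    rng = random.Random(seed + participant_id)
    order = tasks[:]          # copy so the master list stays intact
    rng.shuffle(order)
    return order

for pid in range(1, 4):
    print(f"Participant {pid}: {task_order(pid)}")
```

If some tasks do depend on each other (one sets up data the next one needs), pin those in sequence and only shuffle the independent ones.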

How do you prepare for a usability test?

Before you begin your usability testing session, there are a few things you need to do to make sure that things run smoothly.

A few days before the test:

  1. Make sure the test is scheduled, confirmed with the client, and that you both have the information for connecting to GoToMeeting.
  2. Is your database set up in a way that is friendly to the user?
  3. Is it a "clean" database? (Clients can be confused by a lot of the gibberish set up in support databases.)
  4. Has the database/account been wiped clean from the results of previous testing? As an example, if one of the tasks is to add a solicitation code, it confuses users if the code they create already exists because a previous tester added it the day before.
  5. Do a dry run/practice going through the test. Ask a coworker to give it a glance to make sure the test objectives are being met.

Shortly before the test:

  1. Test your screen recording/sharing setup
  2. Turn off anything that will interrupt the session/notifications/chat/etc
  3. Have bookmarks or pages that are part of the test already open in another tab
  4. Keep your questions/tasks close at hand
  5. Have a method of taking notes available, or someone set as the designated note-taker

How do you run a usability testing session?

Who should do the testing?

Try to choose someone who tends to be patient, calm, empathetic, and a good listener. Don't choose someone whom you would describe as "definitely not a people person" or "the office crank".

Other than keeping the participants comfortable and focused on doing the tasks, the facilitator's main job is to encourage them to think out loud as much as possible. The combination of watching what the participants do and hearing what they're thinking while they do it is what enables the observers to see the site through someone else's eyes and understand why some things that are obvious to them are confusing or frustrating to users.

-- Steve Krug, "Don't Make Me Think"

During the test you are trying to get them to externalize their thought process. Keep them talking. Ask them what they are trying to do, what they are looking at, what they are reading, and what questions are in their mind. Sometimes they will forget to keep verbalizing their thoughts, so you may need to keep prompting them. If you find yourself wondering what the user is thinking, just ask them.

Remember, you're trying not to influence them. Participants need to figure things out on their own. You should be probing and neutral, sort of like a therapist, sticking to what are called "permissible" phrases. Here's some advice to give you a better idea of what those sound like:

As you go through the study with participants, remember that it’s your job to be quiet and listen; let the participants do the talking. That’s how you and your team will learn. Be prepared to ask “Why?” or say “Tell me more about that” to get participants to elaborate on their thoughts. Keep your questions and body language (if in person) neutral, and avoid leading participants to respond a certain way.

During the sessions, someone will need to take notes. Ideally, you’ll have a separate note-taker so you can focus on leading sessions. If not, you’ll need to do this while moderating.

There is a fairly standard script that can guide you in how to begin the usability testing session. We don't follow it exactly, but it is good to loosely follow the key points it covers.

Start by getting them talking for a few minutes. What do they do at their organization? How do they use our product? How much time do they spend in the product? Do they use the feature that is being tested?

A good opener, once the work is shown, is to ask what they think the site/mockup is. What can you do here? What is it for? (NOT what they think of it, their opinion, or whether they like the colors.) After that, you can present them with the scenario and give them their task(s).

Some ideas on things to observe during the test (a sketch of one way to capture these as notes follows the list):

  • How quickly do they seem to accomplish their tasks?
  • What is their flow? (How do they get from point A to point B?)
  • If the task isn't obvious, is it at least easily learnable? Once they do it once, do they "get it"?
  • Where are they running into problems?
  • What sort of emotional reactions are they having?
  • Do they get lost at any point?
  • Are there common places where they need more prompting?
  • What questions do they have?
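If you want your notes to map back to questions like these, it helps to give them a little structure. Here's one possible shape as a sketch; the class and field names are my own invention, not an official template:

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    task: str            # which task the participant was attempting
    timestamp: str       # e.g. "00:14:32" into the recording
    note: str            # what happened, in the participant's words if possible
    kind: str = "issue"  # "issue", "question", "emotion", or "success"

@dataclass
class Session:
    participant: str
    observations: list = field(default_factory=list)

    def log(self, task, timestamp, note, kind="issue"):
        self.observations.append(Observation(task, timestamp, note, kind))

# Example: the note-taker logs a moment of hesitation during a task.
s = Session("Participant 3")
s.log("Create a new form", "00:07:10", "Hesitated over the 'Templates' tab", "question")
```

Tagging each note with the task and a timestamp makes it much easier to line observations up with the session recording afterwards.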

Finally

I know this is a lot to take in at once, so don't worry. Nobody gets this perfect. Pretty much every usability testing session I have ever been in has had at least one thing that could have been better. We will forget something, we'll ask things the wrong way, someone will jump in with a leading question or explain something right before we're about to ask a question about it, there will be technical difficulties, or participants will forget about the session entirely and not show up. It can feel a bit nerve-wracking to be interacting with a client while a room full of other people observes everything that happens. This is normal! So don't worry - it will be flawed and that is ok.

That's about it for my quick & dirty guide to usability testing! You get a gold star 🌟for reading this far. If you have any questions, just ask anyone on the UX Team and we'd be happy to help.