
5 Mistakes People Make When Writing Usability Tasks

Published November 12, 2015 ⚡ Updated on October 2, 2023 by Markus Pirker

While running Userbrain, we've noticed five common mistakes people make when writing usability tasks. We'd like to share them with you, and show you how to avoid making them yourself.

Mistake #1: Revealing everything upfront

How do you expect your testers to deliver valuable feedback if they simply “don’t get” what your product is about?

So it's tempting to explain everything to them in detail right from the start, so that they can begin testing with realistic expectations.

While it’s important to put your testers in the picture and give them a realistic scenario to work in, you shouldn’t miss the opportunity to have THEM tell YOU what they believe the purpose of your site, product, or service is.

BTW, this is also a great technique for testing prototypes.

The difference between getting this right and getting it wrong is often striking:

✘ Wrong
This site targets eCommerce customers and brands, i.e. our typical user.

✓ Right
Please take a look at this site and explore it for a few minutes. Say everything that comes to your mind and tell us what you think it is about:

What can you do on this site?
What products or services are offered on this site?
Who is this site intended for?

Don’t tell people what your site is for. Let them articulate their first impression and see if they understand the purpose of your site without any explanation.

Mistake #2: Asking quantitative questions

Usability testing delivers qualitative feedback on the usability of your site.

Usability testing is by no means a replacement for your quantitative marketing efforts.

While 5–8 testers can reveal around 80% of your site’s usability problems (provided your testing is ongoing), the sample size is simply too small to deliver statistically significant quantitative answers.
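As a back-of-the-envelope check, that figure is roughly consistent with the problem-discovery model popularized by Nielsen and Landauer: if each tester uncovers, on average, a share L of the existing problems, then n testers uncover roughly 1 − (1 − L)^n of them. With the commonly cited average of L ≈ 0.31, five testers find about 1 − 0.69^5 ≈ 84% of the problems. Treat it as a heuristic for finding problems, not as a basis for statistically meaningful answers.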

So don’t ask your testers for their opinion, because you simply won’t get any meaningful results.

✘ Wrong
On a scale of 1-10, how easy is it to navigate the site? What should be improved?

Instead, you should put your testers in the position of someone who is visiting your site with a specific goal.

It’s your job to observe their behavior as they try to achieve that goal, and to judge for yourself whether the site is easy to navigate or not.

While you can capture quantitative data like task success rate or efficiency with usability testing, that data comes from observing your testers, not from asking them to tell you how efficient they were.

How should they know?

“Don’t ask for opinions, observe behavior.”
– Tomer Sharon

✓ Right
You want to find an affordable hotel for your trip to Venice next month. Go on until you reach the final checkout step.
[Observing behavior and measuring task success rate and efficiency]
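If you do track task success rate and efficiency across your ongoing tests, a simple tally is usually all you need. Here is a minimal sketch in Python, assuming you record for each session whether the tester completed the task and how long it took; the session data below is made up purely for illustration:

```python
from statistics import median

# Hypothetical observations from five moderated sessions:
# (tester id, completed the task?, time on task in seconds)
sessions = [
    ("t1", True, 142),
    ("t2", True, 98),
    ("t3", False, 305),  # gave up at the payment step
    ("t4", True, 176),
    ("t5", True, 121),
]

completed = [s for s in sessions if s[1]]

# Task success rate: share of testers who reached the final checkout step
success_rate = len(completed) / len(sessions)

# Task efficiency: median time on task, counting successful attempts only
median_time = median(time for _, done, time in completed)

print(f"Success rate: {success_rate:.0%}")           # 80%
print(f"Median time on task: {median_time:.0f} s")   # 132 s
```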

Mistake #3: Using words from your interface

As soon as you start testing something, you inevitably change what you’re testing.

This is known as the experimenter effect.

All you can do is minimize this effect by leading your testers as little as possible.

One way to do this is to make sure you aren’t priming your testers: don’t describe the task using words that appear in your site’s interface.

If you use the same words in your task as on your site, your testers will unintentionally switch into tunnel vision and hunt for exactly that wording.

All you’ll learn is how your site performs in a scavenger hunt for the words you used.

“The way you design tasks could have a dramatic outcome on the results, without your even realizing it. In a testing situation, the participants really want to please you by following your directions. If the tasks direct participants to take a certain path, that’s the way they’ll go. If it’s not what real users do in the true context of the design’s use, then you may get distorted results.”

Hearing the words your participants use to name things on your site can also be very rewarding. This way, you get valuable feedback on whether your users’ wording is similar to yours.

✘ Wrong
You want to order this product now (Button reads: “Order now”); please proceed by setting up your account (The following step is labeled: Set up your account).

✓ Right
You want to get a gift for your girlfriend’s upcoming birthday. Try to find something that matches her taste and buy it.

Mistake #4: Asking about a hypothetical future

As a marketer, it’s very interesting to understand the funnel that moves people from visitors to leads.

So how about asking them in your usability test what makes them click?

Over the last few months, we’ve come across quite a few tasks like this:

✘ Bad

Would you pay for our App?
Would you use our service?
Would you subscribe to this newsletter in the future?

Don’t ask people what they would do in the future. They simply don’t know.

– People overestimate how much money they will have in the future.
– People overestimate how much time they will have to use your product in the future.
– And they genuinely want to be nice and tell you yes.
– Teresa Torres, “Ask about the past rather than the future”

So instead of asking about a hypothetical future, it’s much better to ask about past behavior.

You’ll learn far more from understanding what triggered your testers’ past behavior than from having them imagine a fictional scenario in a nebulous future.

✓ Better
Have you ever paid for a service like this? When? Do you still use that service?

Mistake #5: Asking about preferred usage

Usability tests let you find out what’s clear and what’s not clear to people as they use your website.

Usability testing is in no way about opinions.

Even worse than asking about opinions is asking about preferred usage.

✘ Bad
Would you rather search using Version A or Version B?
Is the starring feature interesting? Would you use this feature?

If you want to learn about preferred usage, A/B testing is the way to go.

Set up a study with a specific, quantitative goal and compare the results of both versions.
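As a purely illustrative sketch of that comparison (the visitor and conversion counts below are hypothetical), one common approach is a two-proportion z-test on the conversion rates of the two versions:

```python
from math import sqrt, erf

# Hypothetical A/B results for the two search versions
conversions_a, visitors_a = 180, 2400   # Version A
conversions_b, visitors_b = 221, 2380   # Version B

rate_a = conversions_a / visitors_a
rate_b = conversions_b / visitors_b

# Two-proportion z-test: is the difference bigger than chance would explain?
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_b - rate_a) / se

# Two-sided p-value from the standard normal CDF
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"Version A: {rate_a:.1%}   Version B: {rate_b:.1%}")
print(f"z = {z:.2f}, p = {p_value:.3f}")
```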

Of course, you can add usability testing and test both versions, but observe people and try to spot usability problems; don’t expect usability testing to deliver data about preferred usage.

“When people express an opinion, in the course of a usability test, pay attention to it, act like you’re listening to it and taking it in. But, sort of, immediately let go of it and don’t fixate on it, don’t worry about it, just let it go. And make sure that you don’t give them the impression that you’re looking for more opinions.”
– Steve Krug

Conclusion

If you want to learn more about writing tasks, read the Nielsen Norman Group’s article on turning user goals into task scenarios, or take a look at job stories to get an idea of which tasks (aka jobs) you might want to test.
