Psychology and behavioral-economics principles often help designers create interfaces that steer users in a desired direction. For example, prospect theory and loss aversion teach us that letting users try a service before signing up for it can increase the number of registrations. Consider Roadtrippers.com, a travel-planning application that lets visitors immediately enter a destination and start building a trip, without creating an account. Once users have invested their own time learning the interface and planning a route, they quickly gain a sense of ownership and are motivated to create an account to avoid ‘losing’ their work.

The same psychological principles that drive users’ decision making also influence how designers make choices; after all, designers are people too.

Most UX design choices don’t have a single ‘right’ answer. Instead, resolving design trade-offs depends heavily on context. Consequently, UX design decisions are especially vulnerable to bias from framing.

Definition: A frame is the context used to describe an idea, question, or decision. Frames heavily influence our interpretations and conclusions by emphasizing (or ignoring) certain aspects of a situation.

Framing is a well-known phenomenon in psychology. Psychologists Daniel Kahneman and Amos Tversky explored the effect of decision frames and found that the exact same information can lead to opposite conclusions, depending on the frame used to present the decision. For example, a price which is described as ‘discounted’ will attract more buyers than the same price without the ‘discounted’ label. Framing affects all aspects of UX work, from interpreting research findings to selecting design alternatives.

How Framing Affects Design Choices

Imagine you are working on a website design and have just completed a usability test with 20 users. One task involved using the website’s search function, so you now have a quantitative measure of how many users were able to find and use it.

The task results could be stated in two different ways:

  • 4 out of 20 users could not find the search function on the website.
  • 16 out of 20 users found the search function on the website.

Logically, both of these statements describe exactly the same result, which is an objective data point. But if you’re like most people, the conclusions you come to might be very different depending on which phrasing is used.

We recently tested this scenario in an online quiz. As of this writing, 1,037 UX practitioners have participated. Participants were randomly assigned to one of the two versions of the hypothetical study results: half saw the negative framing and half saw the positive framing. All were asked the same follow-up question: “Should the search function be redesigned?”

The exact phrasing used to describe the finding, whether as a success rate or a failure rate, should not matter. But it does: as shown in the chart below, practitioners who saw the finding described as a failure rate were 31% more likely to believe the design needed to be redesigned than those who saw the same result expressed as a success rate. (Only 39% of practitioners who saw the success rate supported a redesign, while 51% of those who saw the failure rate did, an increase of 30.7% relative to the ‘success’ group. In case you’re wondering, this difference is statistically significant at p < 0.0001.)

[Chart: Framing question response rates. 51% advocated for a redesign when shown a ‘failure rate,’ but only 39% advocated for a redesign when shown the same data as a ‘success rate.’]
31% more UX practitioners agreed that a search function should be redesigned after seeing a task-failure rate, compared to practitioners who saw the exact same information expressed as a success rate.
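To unpack the arithmetic behind that 31% figure: it is a relative increase, not an absolute one. The short Python sketch below uses the rounded 51% and 39% shares shown in the chart, so its result may differ slightly in the last decimal from the reported 30.7%.

    # Shares of quiz respondents who advocated a redesign under each framing
    failure_frame = 0.51   # saw '4 out of 20 users could not find the search function'
    success_frame = 0.39   # saw '16 out of 20 users found the search function'

    absolute_difference = failure_frame - success_frame
    relative_increase = absolute_difference / success_frame

    print(f"absolute difference: {absolute_difference * 100:.0f} percentage points")  # 12
    print(f"relative increase:   {relative_increase:.1%}")                            # ~30.8%, reported as ~31%

Keeping both numbers in view, a 12-point absolute difference and a roughly 31% relative increase, is itself a small exercise in viewing the same data through different frames.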

In the real world, there’s no single ‘right’ answer to this question. It’s a judgment call, and the best answer could be influenced by a variety of factors, such as the type of website, the overall importance of the search function, and any implementation costs. Since none of this information was provided in our quiz, ‘I’m not sure’ was technically the best choice, and it’s unsettling that only a minority of practitioners admitted they didn’t know the answer. (It’s also misleading to conclude that the success rate for finding the search function is 16/20 = 80%; with only 20 users, the true success rate may be anywhere between 58% and 93% with 95% confidence, as we explain in our class on measuring user experience.)
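One common way to compute such an interval for a small sample is the adjusted-Wald method; the short Python sketch below shows that calculation as an illustration (other interval methods and rounding choices shift the endpoints slightly).

    import math

    successes, n = 16, 20   # 16 of 20 users found the search function
    z = 1.96                # z-score for a ~95% confidence level

    # Adjusted-Wald interval ('add 2 successes and 2 failures'),
    # a common recommendation for binomial data from small samples
    p_adj = (successes + 2) / (n + 4)
    margin = z * math.sqrt(p_adj * (1 - p_adj) / (n + 4))
    low, high = max(0.0, p_adj - margin), min(1.0, p_adj + margin)

    print(f"{low:.0%} to {high:.0%}")   # about 58% to 92%, versus the observed 80%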

The phrasing of research findings is an obvious example of the potential for framing effects. But this bias also impacts design choices in subtler ways. For example:

  • Incomplete decision frames, which consider only existing users and not potential future users, may overlook critical opportunities to expand an audience.
  • Overly specific frames, which ask questions like ‘Should we implement a responsive version of our site to better support some tasks?’, may overlook other important considerations, such as the potential search-ranking benefit of mobile-optimized design.

How to Counteract Framing Bias

Framing is a necessary part of decision-making. Completely eliminating frames would be impossible, and not helpful anyway: without any context, you couldn’t compare options.

The trick is to become aware of your decision frame, so that you aren’t unconsciously overlooking important information. These three strategies can help minimize the impact of framing bias:

  1. Resist the impulse to make a snap judgment. Acting quickly is satisfying, but taking the time to explicitly think through the context yields more accurate and meaningful decision frames. In our example, consider the amount of time spent planning and running the usability study (with 20 users, this likely took more than the 40 hours spent on many small studies). Spending just a little more time thinking about the findings will vastly increase the ROI of the total investment.
  2. Gather more context before making a decision. Like the practitioners who admitted they weren’t sure of the answer, acknowledge (at least to yourself!) when you don’t have enough data to make an informed choice. Then consider how you could learn more about the situation.
  3. Experiment with different frames. Try restating your question in reverse terms, or from a different point of view. Taking a few seconds to simply flip a data point from a success rate to a failure rate, or to consider not just a percentage of failure but the actual number of people affected, is a quick way to check whether your opinion is being unduly influenced by framing (see the short sketch after this list).
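To make that last check concrete, the same task result can be printed in both frames, with the raw counts alongside the percentages, before you react to it. The small Python helper below is just one way to do that, offered as an illustration rather than a tool from this article:

    def reframe(successes: int, n: int) -> list[str]:
        """Restate one task result as a success rate and as a failure rate."""
        failures = n - successes
        return [
            f"{successes} of {n} users succeeded ({successes / n:.0%} success rate)",
            f"{failures} of {n} users failed ({failures / n:.0%} failure rate)",
        ]

    for statement in reframe(successes=16, n=20):
        print(statement)   # 80% success rate and 20% failure rate describe the same 20 users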

Learn more about framing, cognitive biases, and design decision-making in our full-day course: UX Design Trade-Offs: Decision Frameworks.

Reference:

Tversky, A., & Kahneman, D. (1981). The Framing of Decisions and the Psychology of Choice. Science, 211(4481), 453–458. Retrieved from http://www.jstor.org/stable/1685855