Critics of ‘Deceptive Design’ Push for a More Ethical UX

Too many websites push users to do things they don't want to — such as sharing more of their data than they realize. What can be done about it?
May 11th, 2022 1:41pm
Featured image by Benjamin Elliott on Unsplash. 
In place of the common term “dark patterns,” the author of this piece instead uses “deceptive design” — a term now embraced by both the World Wide Web Foundation and Harry Brignull, the influential ethical-UX advocate — and “manipulative design.” In this article, “dark patterns” appears only in quoted material and in cases of official usage. As Kat Zhou, product designer at Spotify, argues against the term: “‘Dark’ is a nebulous term that reflects the dark-light binary in Western society.”

We work in tech. We know better — at least, we should. But still, all of us, at one point or another, have just rushed to accept the Terms and Conditions without reading them. Because who has time to understand that legalese anyway? And they always seem to pop up right when and where we are trying to do something.

This is just one of the countless ways manipulative design persuades even the most tech-savvy. And finally, there are consequences.

In January, France levied fines of roughly $170 million and $68 million on Google and Facebook, respectively, for making it so hard for users to reject their cookies. This followed a whopping $267 million fine imposed on WhatsApp in September by Ireland’s Data Protection Commission for making it very difficult for users to reject privacy policies that “enabled” them to share personal data with other Facebook companies.

Sir Tim Berners-Lee, co-founder of the World Wide Web Foundation, says that when he invented the web, he envisioned an open space where everyone could build together. In reality, the internet — and, by extension, most of humanity — has come under the influence of the few: tech platforms with billions of users and increasingly centralized ownership, and governments that restrict access to the web as a way to control free speech.

Much of this is built on the success of patterns of manipulation or deception: the very popular, very profitable ways that websites, games and apps confuse users in order to influence their behavior and win consent by default.

In 2019, the Web Foundation launched the Contract for the Web to counteract this. Described as “a global plan of action to make our online world safe and empowering for everyone,” the contract lays out roadmaps for governments, companies and citizens, across gender, race, age and income disparities.

This past November, the foundation hosted a panel called “Working Towards a Web Built For Everyone, By Everyone.” Since then, it’s been collecting examples of #DarkPatterns and ways to tackle these pervasive practices — what the foundation has started calling deceptive design patterns.

What Is Deceptive Design?

What are these intentional patterns of manipulation, baked into user experience (UX) interfaces and website architecture? Harry Brignull, a UX specialist who started a campaign against these patterns in 2010, defined them on his campaign’s website as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.”

At the November event, each panelist offered their own definition, one more foreboding than the last.

“Dark patterns are designs built into web and app interfaces that nudge you towards choices that you may not want to make. This may be subscribing to a service you didn’t mean to or sharing personal data you don’t want to share,” said Adrian Lovett, then president and CEO of the World Wide Web Foundation, during the November panel session.

Deceptive design critics contend that manipulative practices erode trust in digital services and threaten users’ privacy, safety and finances. And they disproportionately affect those with low digital literacy and less financial security.

Nnenna Nwakanma, chief web advocate at the Web Foundation, calls these patterns the “web of confusion”:

  • Subscribing is easy, but canceling is hard.
  • Saying yes is easy, but it’s hard to even find where to say no.

In her role at the foundation, Nwakanma is focused on the policy and systemic changes necessary for a more open web across the Global South. These patterns of confusion, she says, make it particularly hard to say no to the capture of your personal data.

At a global level, she warned, “Dark patterns reduce our autonomy and control online. They are designed to capture three of our major resources: our data, our money and our time.”

Deceptive design patterns are pervasive online. They create a sense of scarcity — and inaccessibility — with countdowns. We’ve all tried to book discount plane tickets on sites that push us to add extra features, a hotel and a car rental along with our flight. Approving your General Data Protection Regulation (GDPR) or California Consumer Privacy Act (CCPA) privacy and cookie settings is easy, but by default, those settings are usually set to allow the most sharing of your data.
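
To make the stakes of those defaults concrete, here is a minimal TypeScript sketch contrasting the two approaches. The ConsentState shape and its category names are hypothetical illustrations, not any real consent-management API:

```typescript
// Hypothetical consent categories, for illustration only.
interface ConsentState {
  strictlyNecessary: boolean; // typically exempt from consent under GDPR
  analytics: boolean;
  advertising: boolean;
  thirdPartySharing: boolean;
}

// Deceptive pattern: every optional category is pre-enabled, so the
// prominent "Accept" button shares the maximum amount of data.
const deceptiveDefaults: ConsentState = {
  strictlyNecessary: true,
  analytics: true,
  advertising: true,
  thirdPartySharing: true,
};

// Trusted pattern: optional categories start disabled and require an
// explicit opt-in -- the "data protection by default" the GDPR calls for.
const trustedDefaults: ConsentState = {
  strictlyNecessary: true,
  analytics: false,
  advertising: false,
  thirdPartySharing: false,
};
```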

These pervasive design patterns “manipulate the information flow right before you are doing something as a consumer,” said Mihir Kshirsagar, another expert on the November panel, who leads the interdisciplinary Center for Information Technology Policy at Princeton University.

As a result, he said, the web — originally built around end-user autonomy and decision-making — has become an engine for commerce.

Is Deceptive Design Marketing or Manipulation?

Where is the line drawn between a legitimate nudge or sharp marketing practice and a pattern of manipulation?

“It crosses the line when the expectations of the consumer don’t match up to what they’re actually getting and/or the organization uses or plays on the expectations of the consumer and gives them something completely different,” said Jasmine McNealy, attorney and associate professor of communications at the University of Florida, at the November panel.

“Persuasion is OK,” she said, but a pattern of manipulation is uncovered when web services are “deceiving people to otherwise do what they wouldn’t do,” like the cookie toggle, which varies from site to site.

“And frankly, I don’t think we know all the ways that design can be deceptive,” she added. (McNealy is also co-creator of I, Obscura, a magazine on “dark design patterns,” and a researcher for the Dark Patterns Tip Line.)

This work has uncovered that these patterns go beyond mere marketing when they cause what she calls immediate harms:

  • If you’re unable to immediately unsubscribe, you can be hit with recurring charges.
  • If something is slipped into your shopping cart, you will pay for more than you intended.
  • If your personal data is taken without permission, you’ve lost some autonomy.
  • If your personal data continues to be used without permission, your autonomy erodes further.
  • If a website lets you infinitely scroll, or includes other kinds of design manipulations that keep you on a platform for longer than you would or should stay, that negatively affects your focus, emotions and perception of the world.

What’s to Blame for Deceptive Design?

Is it unavoidable that organizations would adopt these design patterns? Manipulative tactics have always existed in physical stores, so logically they transferred to the digital experience — at a rapid tech scale, noted Kat Zhou, product designer at Spotify.

“Especially as our internet as we know it pivoted to the attention economy, the proliferation of manipulative designs was inevitable,” she said on the same panel.

These design patterns are also a logical consequence of the whole agile, move-fast, don’t-put-too-many-speed-humps-in-the-way mentality.

“Fundamentally, the way that our industry is set up makes it very hard because companies, product teams, designers and engineers have to ruthlessly prioritize growth — and short-term growth at that,” Zhou said.

“So it makes it very hard for the boots on the ground, building these products, to ensure that our advocacy for ethical practices in our companies is being taken seriously.”

This realization — that everything online is structured to maximize shareholder value — is what motivated Zhou to create the Design Ethically Framework for product teams.

“Dark patterns reduce our autonomy and control online. They are designed to capture three of our major resources: our data, our money and our time.”

— Nnenna Nwakanma, chief web advocate, World Wide Web Foundation

A further challenge is that product teams are so distributed and siloed — or, at more elite DevOps organizations, so autonomous — that it can be difficult to organize and speak out in solidarity. This is especially true, she said, in larger multinational companies, where there is still risk of retaliation and job loss if you speak out, and truer still, she suggested, for designers from marginalized communities.

Zhou pointed to the concept of racial capitalism as a logical trigger of manipulative design: the idea that systems of extraction and exploitation are closely intertwined with and informed by social constructs of race.

“It’s exactly this system that renders a lot of the problematic, wasteful, extractive, surveillant technology to be incredibly profitable,” Zhou said. “I don’t know if it’s possible to harmonize those interests, but it’s definitely necessary to reimagine how our industry is structured.”

With this in mind, the most vulnerable to these patterns are consumers with less education and lower incomes, those experiencing negative life events like divorce or the death of a family member, and, of course, those furthest from the creation of the manipulative design.

“Those groups also coincide with the most vulnerable in our society, because those in tech are not necessarily representative of the demographic of users that they’re building for,” Zhou said. The tech industry, according to numerous studies, is predominantly middle to upper class, highly educated, male, and white or East Asian, and concentrated in technologically developed countries.

But even tech workers still get duped by anti-patterns. “So you can only imagine how difficult it would be for a seven-year-old child playing video games or a 70-year-old grandmother,” Zhou noted.

Location Settings: A Most Invasive Pattern

Finn Lützow-Holm Myrstad, director of digital policy at Forbrukerrådet, the Norwegian consumer council, published a landmark report on “dark patterns” in 2018, specifically concentrating on Facebook and Google. It led to Facebook turning off its facial-recognition feature and updating its dodgy consent flow last November.

Another pattern the report unveiled involves how Google controls the Android operating system across about 2.5 billion devices, which “therefore gives Google a very powerful position in people’s private lives, because the phone is so private,” Lützow-Holm Myrstad said at the November panel.

The consent flow for Google and the Android operating system — which Forbrukerrådet extensively tested — appears right when you are eager to start using your phone. The Norwegian researchers observed how Google and Android repeatedly pushed users into giving up their location data, which of course includes deeply personal information like where you live, work, practice religion and seek medical assistance.

As noted in the report, “Google is known for its meticulousness when it comes to user interaction design. Therefore, it stands to reason that these design patterns are carefully tested and considered” — in other words, part of an intentional strategy.

“It’s very valuable and it’s very revealing,” Lützow-Holm Myrstad said. “So it was hugely problematic that Google was using deceptive click flows, hidden default settings, misleading and unbalanced information, repeated nudging, and bundling of services, with a lack of granular choices.”

Even when there are regulations in place, actually taking breaches to court takes a very long time. Following this report, seven national European consumer organizations filed a legal complaint against Google in Europe for breaches of the GDPR. Results are still pending.

Building a More Ethical UX

Are critics of deceptive design patterns optimistic that UX practices will become more ethical, less coercive? Maybe stubborn resilience is a better summation of their hopes for the future. On the November panel, for instance, participants agreed that they would continue to work to thwart these patterns, through a blend of socio-technical solutions, particularly for those left more vulnerable in the Global South.

Fighting these deceptive patterns, Lovett says, means moving toward trusted design and shaping a web that is:

  • Less discriminatory
  • Less manipulative
  • More equitable
  • More empowering
  • More ethical

There are some easy wins that could be accomplished toward these goals, experts said. Tech providers could choose to coalesce around agreed-upon patterns. A common data privacy consent flow, Kshirsagar suggested, would be an obvious way not only to subvert the deception, but also to clarify how companies can best comply.
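
One existing effort in this direction, though not one raised at the panel, is the Global Privacy Control (GPC) signal, which participating browsers send with every request as a Sec-GPC: 1 header. As a rough sketch, assuming a plain Node.js server, a site could honor that signal as a standing opt-out instead of showing yet another banner:

```typescript
// Sketch: honoring the standardized Global Privacy Control signal
// server-side with Node's built-in http module. Real opt-out plumbing
// and error handling are omitted.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  // Participating browsers send "Sec-GPC: 1"; Node lowercases header names.
  const optedOut = req.headers["sec-gpc"] === "1";

  if (optedOut) {
    // Treat the signal as a standing refusal: skip optional tracking for
    // this request. (What that means concretely depends on the site's stack.)
  }

  res.end(optedOut ? "opt-out honored" : "consent flow shown");
});

server.listen(8080);
```

A shared browser-level signal like this sidesteps per-site banner design entirely, which is in the spirit of the common consent flow Kshirsagar describes.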

Applying the continuous improvement of design thinking, while intentionally inviting to the table those who most often live under the conditions manipulative design creates, could be a way to start rethinking UX patterns, Nwakanma suggested.

As with all things, the first step is to agree on a name and definition. Earlier this year, the Web Foundation settled on the term “deceptive design patterns” and its antithesis “trusted design patterns.”

“Fundamentally, the way that our industry is set up makes it very hard because companies … have to ruthlessly prioritize growth — and short-term growth at that. So it makes it very hard for the boots on the ground, building these products, to ensure that our advocacy for ethical practices in our companies is being taken seriously.”

—Kat Zhou, product designer, Spotify

This trusted design needs to be implemented at all levels, including the team level. That means working consciously toward decolonizing design practices and pedagogy, Zhou says, like building half-hour consequence-scanning and ethical-design exercises into sprints.

Things also have to change at the regulatory level, “especially in the Global South where capacity [to monitor], partnership and implementation is lacking,” she said.

Critics of deceptive design, including all the November event panelists, advocate for a regulatory framework that stops supporting business models built around the manipulative collection of data.

You just have to be prepared for pushback. “A barrier to regulation on the privacy and data side is, we think too much about consent,” Zhou said. “And that is a part of a business model that has allowed for an unbridled collection of so much data.”

There are technical solutions, too, like automating some aspects of consumer protection and leveraging interoperability and data portability to automatically detect manipulative patterns.
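
As a toy illustration of what automated detection could look like, here is a TypeScript sketch that flags consent-related checkboxes arriving pre-checked, one of the simplest deceptive cues. The function name and keyword heuristic are illustrative assumptions, far cruder than the large-scale crawlers researchers actually use:

```typescript
// Toy heuristic: flag checkboxes that are pre-checked in the HTML and
// whose labels look consent-related. Purely illustrative; real detectors
// combine many more signals.
function findPrecheckedConsentBoxes(doc: Document): HTMLInputElement[] {
  const boxes = Array.from(
    doc.querySelectorAll<HTMLInputElement>('input[type="checkbox"]')
  );
  return boxes.filter((box) => {
    // defaultChecked reflects the markup's initial state: opt-in by default.
    const preChecked = box.defaultChecked;
    // Crude keyword match on the nearest label's text.
    const label = (box.closest("label")?.textContent ?? "").toLowerCase();
    const consentish = /consent|cookie|marketing|newsletter|share/.test(label);
    return preChecked && consentish;
  });
}
```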

Only two things are certain: deceptive design will continue, and the tech industry can and should do better than that. Said Nwakanma, “An empowering web puts control into the hands of the users,” offering “a safe, secure, empowering environment for everyone.”
