CONSTANT FEED

Designers are using “dark UX” to turn you into a sleep-deprived internet addict

Candy.
Image: Reuters/Stoyan Nenov

Last week, Facebook rolled out an update to the design of its News Feed. The design tweaks—near-imperceptible pixel spacing and color enhancements—were meant to make Facebook’s infinitely updating cascade of stories even easier to consume and comment on. “Small changes, like a few extra pixels of padding or the tint of a button, can have large and unexpected repercussions,” wrote Facebook design leads Shali Nguyen and Ryan Freitas in an Aug. 15 Medium post about the update. One of the objectives for the update was to find a way to make Facebook even more “engaging and immersive.”

Of course, the goal of any good design is to keep users engaged. But in the case of Facebook and other social media, “engagement” can easily turn into “can’t-stop-scrolling,” or even addiction. The result is an often overlooked ethical conundrum for designers creating digital experiences: When is design too engaging for users’ own good?

Internet addiction is associated with poor health, obesity, social isolation, and even brain damage, and the US National Institutes of Health classifies internet addiction as a social disorder that causes “neurological complications, psychological disturbances, and social problems.” A 2014 study published in the journal Cyberpsychology, Behavior, and Social Networking suggests that 6% of the world’s population has a problem with internet use. Most of us can feel degrees of the same problem when we try to detach from our mobile phones during dinner or fail to ignore work emails on holiday.

Hooked.
Image: AP Photo/Greg Baker

Your brain on News Feed

Design is a crucial element in making websites and apps more addictive, explain professors Ofir Turel and Antoine Bechara, who co-authored a 2016 neurological study comparing Facebook addiction to cocaine addiction. Because the News Feed is the first interface seen by Facebook’s 2 billion users, every tweak to it has far-reaching effects.

Facebook’s News Feed acts like a “slot machine for the brain,” say Turel and Bechara. Every time we refresh the site, it generates different rewards that encourage us to keep on using it. In an email to Quartz, Turel and Bechara write:

This is like the idea that most of us like cakes. When we open a refrigerator door multiple times and see the same cake, we will not be as motivated to eat like if we opened the refrigerator door multiple times, and each time see a different cake (i.e., be exposed to a variable reward)….The objective of introducing better “reward management” abilities was to ensure users spend more time on Facebook as the rewards they are exposed to after the new abilities were implemented are presumed to be larger than before.
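
To make the variable-reward idea concrete, here is a minimal, hypothetical TypeScript sketch. It is not Facebook’s actual ranking code; the story list and functions are invented purely to contrast a fixed reward (the same cake every time) with a variable one.

```typescript
// Hypothetical illustration of fixed vs. variable rewards on refresh.
// Nothing here reflects Facebook's real News Feed code.

const stories = ["baby photos", "breaking news", "a meme", "an old friend's engagement"];

// Fixed reward: every refresh serves the same item, so novelty fades fast.
function refreshFixed(): string {
  return stories[0];
}

// Variable reward: each refresh may serve something different and unpredictable,
// which is what keeps users pulling the lever "one more time."
function refreshVariable(): string {
  return stories[Math.floor(Math.random() * stories.length)];
}

for (let pull = 1; pull <= 5; pull++) {
  console.log(`refresh ${pull}: fixed -> ${refreshFixed()}, variable -> ${refreshVariable()}`);
}
```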

Image: AP Photo/Andy Wong

The work is done through user experience, or UX, design, a specialization that considers the totality of a user’s experience while they’re using a piece of technology. The term was coined in the early 1990s by Don Norman, then Apple’s “user experience architect,” and today it is typically used to describe the functionality of websites and apps.

By improving the News Feed’s design, Facebook’s UX designers are eliminating any friction that might cue users to curtail their time on the platform. “When the interaction is smooth and requires no thought or difficult-to-remember steps, the behavior will more likely and easily become automated and rewarding,” Turel and Bechara explain to Quartz. “[Facebook’s] ultimate objective with these moves is to retain users’ attention and keep them scrolling, posting and using their sites.”

Designer or master manipulator?

Designing to encourage addictive behavior is a studied skill. Legions of designers are now learning the psychology of persuasion and using these tactics to make sites and apps “stickier.” One place they learn it is the Stanford Persuasive Tech Lab. Spearheaded by behavior scientist BJ Fogg, the lab teaches students the tenets of “captology,” the study of computers as persuasive technologies.

Tristan Harris, an alumnus of the program and a former Google employee, paints a grim picture of how designers learn these persuasive techniques: “There are conferences and workshops that teach people all these covert ways of getting people’s attention and orchestrating people’s lives,” Harris explained at TED last April. “I want you to imagine walking into a room, a control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people. This might sound like science fiction, but this actually exists right now, today,” he said, describing the typical scenario in Silicon Valley product design departments.

Fogg clarified to Quartz that ethics has always been part of his lab’s curriculum, and supplied several videos and published papers on the subject. Captology’s goal, he says, is to extend the user’s will and improve their well-being.

“We believe that much like human persuaders, persuasive technologies can bring about positive changes in many domains, including health, business, safety, and education,” claims the program’s website. Fogg’s lab, for instance, has helped the US Centers for Disease Control increase the efficacy of its health campaigns in Africa through SMS. Captology students learn how to read user data and how to use it to sway users’ choices. Their curriculum includes classes on psychology and persuasion theory, as well as several on how to create the most engaging content for Facebook.

But can designers really be trusted to decide what’s good for us? Should we trust designers and programmers with so much power without regulations?

A boy who was addicted to the internet has his brain scanned at Daxing Internet Addiction Treatment Center in Beijing.
Image: Reuters/Kim Kyung-Hoon

Inside the world of “dark UX”

UX design’s founding principle is utopian. According to Norman, “the first requirement for an exemplary user experience is to meet the exact needs of the customer, without fuss or bother.” Addictive, well-designed interfaces mean that UX designers are doing their jobs. And micro visual cues, like a bigger “Buy Now” button or flashy testimonials, can be value-neutral tools of the trade just as easily as they can be tactics in the battle for your attention.

Deloitte Digital’s UX competency lead Emily Ryan describes the tricky line designers walk every day: “At the end of the day, it’s extremely tough to say ‘no’ to a product manager who’s telling you to add some feature that we know the user doesn’t want, doesn’t need and ultimately will mar their experience in some form.” Ryan, whose personal website is aptly named “UX is everywhere,” says that conscientious designers will propose alternatives to the dirty, manipulative design tactics known as “dark UX.”

Dark UX is an industry term for sly design tricks that benefit the client’s bottom line. It ranges from manipulative defaults, such as a pre-checked opt-in to an email subscription or a pre-selected most-expensive option, to interfaces that require customers to hand over personal information before they are allowed to look at the products on a website.
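
As a concrete and entirely hypothetical illustration of the defaults pattern described above, the TypeScript sketch below pre-checks a marketing opt-in so that a user’s inaction counts as consent; the element IDs and label copy are invented for the example and are not taken from any real site.

```typescript
// Hypothetical "manipulative default" pattern: the checkbox starts checked,
// so users are subscribed unless they notice it and opt out.
// IDs and copy are invented for illustration only.

const optIn = document.createElement("input");
optIn.type = "checkbox";
optIn.id = "newsletter-opt-in";
optIn.checked = true; // dark UX: inaction becomes consent

const label = document.createElement("label");
label.htmlFor = optIn.id;
label.textContent = "Yes, send me partner offers and marketing emails";

// A neutral default would leave the box unchecked and let users opt in deliberately:
// optIn.checked = false;

document.body.append(optIn, label);
```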

Ryan explains that there are different levels of UX hell. “‘Dark UX’ is on a scale. There is ‘less bad’ (simply bad experience) and ‘more bad’ (detrimental to a user’s overall safety & security) and most designers would likely be somewhat ok with being on the line on the less bad side than the more bad side,” she says.

But ultimately, the customer is always right. “I wish we had more ability to push clients towards better design practices but from a realistic standpoint, I’m just not sure it’s going to do much. The client can always find someone willing to do what they want, regardless of whether it’s ethical or not,” reflects Ryan, who has been studying alternatives to UX design trickery. “The irony is that these tactics rarely work and often backfire,” she explains.

As foot soldiers to a corporate or client mandate, designers rarely perceive or openly discuss the ethical dilemmas in their work. While the legal and health professions have formal ethical codes, designers do not. Stephen P. Anderson, who penned the book Seductive Interaction Design, describes the scenario aptly:

If you hire a lawyer to defend you, you expect that person to do everything in his power to prove your innocence or ensure you get a fair trial. If you hire a personal trainer to help you shed some pounds, you expect that person to use the tools and methods at her disposal to help you reach your goals. Similarly, if someone hires you to create a new homepage that leads to more sales, they expect you to use whatever skills you have to accomplish this goal. Do you hold back?

The few attempts to establish a do-no-harm moral code, or “Hippocratic Oath for designers,” have turned out to be gimmicky, resulting in cute, stylized persona mantras that confuse ethics with aesthetics. Others fizzled out for lack of collective resolve to implement a mandatory code of conduct.

For any ethical resolution to stick, designers first have to believe their work can directly cause good or harm.