Facebook Home runs on an Android phone. The app was a previous attempt by Facebook to capitalise on its Android userbase. Photograph: Peter DaSilva/EPA

Facebook accused of deliberately breaking some of its Android apps

This article is more than 8 years old

Social network ran experiment to see how long users would wait before giving up and going elsewhere, but people ‘never stopped coming back’

Facebook’s habit of experimenting on its customers has again led to anger, following allegations that it deliberately broke its app for a small number of users to see what they would do.

In a report from tech journal The Information, Facebook is accused of selectively crashing its Android app, for long periods of time, in an effort to discover the threshold at which users just give up and go away. But the lure of Facebook proved too strong: “The company wasn’t able to reach the threshold,” the site says, with someone familiar with the experiment adding that “people never stopped coming back”.

Even if the app was broken for hours on end, people simply used the mobile web version of the site, rather than not use Facebook.

The test only happened once, “several years ago”, but it reignites controversy around the site’s user testing. In 2014, Facebook faced a large backlash after revealing that it had been experimenting on its users to study “emotional contagion”. It eventually apologised for the psychological experiments, which involved deliberately increasing the positive or negative content visible on subjects’ newsfeeds and then attempting to discern whether doing so made their own postings happier or sadder.

While some degree of experimentation is common at most Silicon Valley firms, it typically entails small changes such as moving interface elements or altering designs. The emotional contagion experiment went significantly further than many were comfortable with, attempting to directly influence users’ emotions in a negative manner.

The latest revelation has prompted similar criticism. In a world where Facebook is increasingly attempting to position itself as a crucial part of daily life (even replicating traditional functions of the state such as safety checks after terrorist attacks), disabling access for users simply to see what they would do has struck some as a dangerous excess.

Casey Newton, a writer for tech site The Verge, added that a major problem is that “users are almost totally unaware of these experiments. And if they do eventually find out about them, they can’t really leave – because there’s simply no other meaningful Facebook-like service in the market. That gives the company a moral imperative to treat its users honestly.”

The driving motivation for the experiments, according to The Information, was for the company to develop a backup plan in case its competition with Google (over advertising and search) flips into all-out war. In that event, the most powerful weapon in Google’s arsenal would be to remove the Facebook app from the Google Play store, kneecapping Facebook’s ability to reach most users of the Android operating system.

While Facebook could make its app work without the Google Play store, it would also have to develop its own replacements for many of the services provided by Google Play Services, including automatic updates and in-app purchases. Facebook declined to comment.
