Joy Buolamwini’s research helped persuade Amazon, IBM, and Microsoft to put a hold on facial recognition technology. Through her nonprofit Algorithmic Justice League, she’s now battling AI bias in other realms.

Meet the computer scientist and activist who got Big Tech to stand down

[Photo: Shaniqwa Jarvis; photographed on location at Windy Films Studio]

By Amy Farley | 9 minute read

Joy Buolamwini got Jeff Bezos to back down.

In June, Amazon announced that it was issuing a moratorium on police use of its controversial facial recognition software, called Rekognition, which it had sold to law enforcement for years in defiance of privacy advocates. The move marked a remarkable retreat for Amazon’s famously stubborn CEO. And he wasn’t alone. IBM pledged that same week to stop developing facial recognition entirely, and Microsoft committed to withholding its system from police until federal regulations were passed.

These decisions occurred amid widespread international protests over systemic racism, sparked by the killing of George Floyd at the hands of Minneapolis police. But the groundwork had been laid four years earlier, when Joy Buolamwini, then a 25-year-old graduate student at MIT’s Media Lab, began looking into the racial, skin type, and gender disparities embedded in commercially available facial recognition technologies. Her research culminated in two groundbreaking, peer-reviewed studies, published in 2018 and 2019, that revealed how systems from Amazon, IBM, Microsoft, and others were unable to classify darker female faces as accurately as those of white men—effectively shattering the myth of machine neutrality. 

Today, Buolamwini is galvanizing a growing movement to expose the social consequences of artificial intelligence. Through her nearly four-year-old nonprofit, the Algorithmic Justice League (AJL), she has testified before lawmakers at the federal, state, and local levels about the dangers of using facial recognition technologies with no oversight of how they’re created or deployed. Since George Floyd’s death, she has called for a complete halt to police use of face surveillance, and is providing activists with resources and tools to demand regulation. Many companies, such as Clearview AI, are still selling facial analysis to police and government agencies. And many police departments are using facial recognition technologies to identify, in the words of the New York Police Department, individuals who “have committed, are committing, or are about to commit crimes.” “We already have law enforcement that is imbued with systemic racism,” Buolamwini says. “The last thing we need is for this presumption of guilt of people of color, of Black people, to be confirmed erroneously through an algorithm.” (This isn’t a hypothetical: The ACLU of Michigan recently filed a complaint against the Detroit Police Department on behalf of a man who was wrongly arrested for shoplifting based on incorrect digital image analysis.)


ABOUT THE AUTHOR

Amy Farley is the executive editor at Fast Company, where she edits and writes features on a wide range of topics including technology, music, sports, retail, and the intersection of business and culture. She also helps direct the magazine’s annual Most Innovative Companies franchise.
