Service that uses AI to identify gender based on names looks incredibly biased

Meghan Smith is a woman, but Dr. Meghan Smith is a man, says Genderify

Some tech companies make a splash when they launch; others bellyflop.

Genderify, a new service that promised to identify someone’s gender by analyzing their name, email address, or username with the help of AI, looks to be firmly in the latter camp. The company launched on Product Hunt last week and picked up a lot of attention on social media as users discovered biases and inaccuracies in its algorithms.

Type the name “Meghan Smith” into Genderify, for example, and the service offers the assessment: “Male: 39.60%, Female: 60.40%.” Change that name to “Dr. Meghan Smith,” however, and the assessment changes to: “Male: 75.90%, Female: 24.10%.” Other names prefixed with “Dr” produce similar results, and inputs in general seem to skew male: “Test@test.com” is said to be 96.90 percent male, for example, while “Mrs Joan smith” is 94.10 percent male.
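
Genderify never explained how it computed these scores, but the pattern is what you’d expect from a naive per-token model trained on biased data. As a purely hypothetical sketch (the token weights, the male-skewed fallback prior, and the scoring scheme below are invented for illustration, not Genderify’s actual model), a single male-skewed title token like “dr” can flip a whole name, and unrecognized strings can default to male:

    # Purely hypothetical sketch: invented per-token P(male) weights and a
    # male-skewed fallback prior, combined naive-Bayes style. Not Genderify's code.
    TOKEN_MALE_PROB = {
        "meghan": 0.05,  # strongly female in the invented data
        "joan": 0.04,
        "mrs": 0.02,
        "smith": 0.50,   # surnames carry little gender signal
        "dr": 0.98,      # the title skews heavily male in the invented data
    }
    FALLBACK_MALE_PROB = 0.95  # unknown tokens default toward male

    def guess_gender(text: str) -> dict:
        """Combine per-token odds; unseen tokens use the skewed fallback."""
        tokens = text.lower().replace(".", " ").replace("@", " ").split()
        male, female = 1.0, 1.0
        for t in tokens:
            p = TOKEN_MALE_PROB.get(t, FALLBACK_MALE_PROB)
            male *= p
            female *= 1.0 - p
        p_male = male / (male + female)
        return {"male": round(p_male * 100, 2), "female": round(100 - p_male * 100, 2)}

    print(guess_gender("Meghan Smith"))      # leans female
    print(guess_gender("Dr. Meghan Smith"))  # "dr" flips the result male (~72%)
    print(guess_gender("Test@test.com"))     # all-unknown tokens: ~100% male

Under these invented weights, “Meghan Smith” comes out mostly female, “Dr. Meghan Smith” flips to roughly 72 percent male, and “Test@test.com,” built entirely from unknown tokens, comes out almost entirely male, the same general shape users observed in Genderify’s output.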

The outcry against the service has been so great that Genderify tells The Verge it’s shutting down altogether. “If the community don’t want it, maybe it was fair,” said a representative via email. Genderify.com has been taken offline and its free API is no longer accessible.

Although these sorts of biases appear regularly in machine learning systems, the thoughtlessness of Genderify seems to have surprised many experts in the field. The response from Meredith Whittaker, co-founder of the AI Now Institute, which studies the impact of AI on society, was somewhat typical. “Are we being trolled?” she asked. “Is this a psyop meant to distract the tech+justice world? Is it cringey tech April fool’s day already?”

Making assumptions about people’s gender at scale could be harmful

The problem is not that Genderify made assumptions about someone’s gender based on their name. People do this all the time, and sometimes make mistakes in the process. That’s why it’s polite to find out how people self-identify and how they want to be addressed. The problem with Genderify is that it automated these assumptions: applying them at scale while sorting individuals into a male/female binary (and so ignoring people who identify as non-binary) and reinforcing gender stereotypes in the process (such as: if you’re a doctor, you’re probably a man).

The potential harm of this depends on how and where Genderify was applied. If the service was integrated into a medical chatbot, for example, its assumptions about users’ genders might have led to the chatbot issuing misleading medical advice.

Thankfully, Genderify didn’t seem to be aimed at automating this sort of system; it was primarily designed as a marketing tool. As Genderify’s creator, Arevik Gasparyan, said on Product Hunt: “Genderify can obtain data that will help you with analytics, enhancing your customer data, segmenting your marketing database, demographic statistics, etc.”

In the same comment section, Gasparyan acknowledged the concerns of some users about bias and ignoring non-binary individuals, but didn’t offer any concrete answers.

One user asked: “Let’s say I choose to identify as neither Male or Female, how do you approach this? How do you avoid gender discrimination? How are you tackling gender bias?” To which Gasparyan replied that the service makes its decisions based on “already existing binary name/gender databases,” and that the company was “actively looking into ways of improving the experience for transgender and non-binary visitors” by “separating the concepts of name/username/email from gender identity.” It’s a confusing answer given that the entire premise of Genderify is that this data is a reliable proxy for gender identity.
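
The approach Gasparyan describes is structurally simple, and the structure is the problem. In the hypothetical sketch below (the records and function are invented; only the schema point matters), a database that stores exactly one of two labels per name bakes the binary into the data model itself, so no downstream tweak can represent a non-binary user:

    # Hypothetical sketch of a lookup against a binary name/gender database.
    # The records here are invented; the structural point is that the schema
    # itself admits only two labels, so non-binary identities have no slot.
    from typing import Optional

    # name -> (binary label, share of bearers recorded under that label)
    BINARY_NAME_DB: dict[str, tuple[str, float]] = {
        "joan": ("female", 0.99),
        "meghan": ("female", 0.99),
        "john": ("male", 0.99),
    }

    def lookup(first_name: str) -> Optional[tuple[str, float]]:
        """Return the binary label recorded for a first name, if any.

        Whatever a caller does downstream, the output space is fixed at
        {male, female}: the limitation lives in the data, not the code.
        """
        return BINARY_NAME_DB.get(first_name.lower())

    print(lookup("Meghan"))  # ('female', 0.99)
    print(lookup("Alex"))    # None; no third category exists to return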

The company told The Verge that the service was very similar to those of existing companies that use databases of names to guess an individual’s gender, though none of those companies use AI.

“We understand that our model will never provide ideal results, and the algorithm needs significant improvements, but our goal was to build a self-learning AI that will not be biased as any existing solutions,” said a representative via email. “And to make it work, we very much relied on the feedback of transgender and non-binary visitors to help us improve our gender detection algorithms as best as possible for the LGBTQ+ community.”

Update Wednesday July 29, 12:42PM ET: Story has been updated to confirm that Genderify has been shut down and to add additional comment from a representative of the firm.