Future Tense

It’s Time for a Reckoning About This Foundational Piece of Police Technology

Criminal intelligence databases may seem unobjectionable in an era of facial recognition and predictive policing. But they are deeply flawed, too.

[Illustration: a social network of lines connecting people and files. Natalie Matthews-Ramo]

This article is part of the Policing and Technology Project, a collaboration between Future Tense and the Tech, Law, & Security Program at American University Washington College of Law that examines the relationship between law enforcement, police reform, and technology. On Sept. 18 at noon Eastern, Future Tense will host “Power, Policing, and Tech,” an online event about the role of technology in law enforcement reform. For more information and to RSVP, visit the New America website.

Public scrutiny of data-driven technologies in the criminal justice system has been on a steady rise over the past few years, but with the recent widespread Black Lives Matter mobilization, it has reached a crescendo. Alongside a broader reckoning with the harms of the criminal justice system, technologies like facial recognition and predictive policing have been called out as racist systems that need to be dismantled. After being an early adopter of predictive policing, Santa Cruz, California, became the first city in the United States to ban its use. An ethics committee of a police department in the United Kingdom unanimously rejected a proposal for the department to further develop an artificial intelligence system to predict gun and knife crime. And the use of pretrial and sentencing risk assessments remains at the center of public debate over how best to address mass incarceration and racial disparities within the criminal justice system.

But a foundational piece of police technology is missing from this reckoning: criminal intelligence databases. They may be largely absent from the public debate because databases are typically considered simple record repositories, often seen as the “first stage” in the creation of more high-tech A.I. systems. But these databases perform varied and advanced functions of profiling, not unlike systems of predictive policing. The historical context and political ramifications of these systems also mirror the systematic stigmatization and “feedback loop” that is now commonly understood as a fallout of predictive A.I. systems.

Unlike investigative databases that are used to solve serious crimes and build prosecutors’ cases, criminal intelligence databases are populated with information about people who should be monitored and subjected to greater scrutiny because they might commit a future crime—for example, the notorious No Fly List created and maintained by the FBI’s Terrorist Screening Center, which prohibits people from boarding commercial aircraft for travel within, into, or out of the United States based on government threat assessments. Yet these databases are often seen as passive aids for information gathering, rather than new methods of surveillance, which has contributed to the lack of legal safeguards. In the U.S., for example, these databases do not need to comply with the same constitutional and legal standards that govern criminal investigations, like due process and freedom of association. These databases are heavily influenced by politics and public sentiments, and their composition and use often reflect the prerogatives and biases of law enforcement agencies.

Gang databases serve as a great example for understanding these complexities. They have been around for decades, but their use has expanded globally as a crime-fighting tool in recent years. Defining what constitutes a gang or who is a gang member, however, is not as clear-cut as one may think. Who (and what kinds of information) is included in a gang database is typically guided by formal or informal policies that provide a definition of “gang” or “gang members.” But in the United States the legal definitions of gangs and gang members are so inconsistent that someone who meets the definition of a gang member in one state may not be considered a gang member in a neighboring state. There is also no consensus on what constitutes gang activity, since gang membership is not a criminal offense in itself. The lack of clear guidance and rules means police officers have a lot of discretion in making such determinations.

Does making a gang hand symbol in a social media post mean that the individual is a gang member? Are explicit or violent rap lyrics evidence of gang or criminal activity? Does frequenting areas where gang members are known to meet warrant inclusion in a gang database? Without clear gang database policies, police officers more often than not rely on subjective judgments and stereotypes to identify gang members and gang activity. And research and reporting on gang databases have shown that these judgments about who and what to include reflect a historic pattern of overpolicing Black, Latinx, and other racial and ethnic minority communities. The NYPD’s gang database is 99 percent Black and Latinx residents, the same demographic targeted by the department’s unconstitutional stop-and-frisk program. In London, 78 percent of the people in the Metropolitan Police’s Gang Matrix database are young Black men, even though the department’s own figures show that this demographic accounts for only 27 percent of youth violence.

The history of criminalizing entire groups through database technologies far predates the digital. Current gang database practices in the U.K. have been linked to the British colonial strategy of criminalizing entire communities (designated “criminal tribes”) in India. In the United States, federal and local law enforcement have repeatedly created watchlists of political activists without any evidence of criminal activity; when criminal charges did arise, they were primarily loosely constructed conspiracy cases. The more recent digitization of databases only amplifies the problems of stigmatization and hyper-surveillance. Digitized and networked databases can more easily break down the silos between different government agencies, allowing for more seamless information sharing and, as a result, more pervasive institutional profiling that can be used to justify differential treatment. Gang database designations are often shared not only with prosecutors, judges, and prison and jail officials but also with schools, public housing authorities, immigration agencies, and employers. This means that being labeled and sorted in these systems carries significant and unique consequences for individuals and communities, far beyond the criminal justice system. Individuals identified in gang databases are subjected to increased police scrutiny and harassment, but so are their family members, neighbors, and other individuals who share any characteristics with them (such as race, age, height, and gender presentation). All of these negative consequences rest on the assumption that individuals in databases will commit a crime, even when those individuals have no prior criminal convictions.

Despite being functionally similar to predictive policing and other risk assessment tools, gang databases don’t seem to attract the degree of regulatory scrutiny and media attention afforded to other purportedly “new” technologies. To illustrate: Even as the Chicago Police Department announced it would no longer use its controversial predictive policing program, it proudly unveiled plans to revamp its much-criticized gang database. If a gang database designation leads a judge to deny a defendant bail, how is that functionally different from a biased risk-assessment tool’s recommendation? If gang databases perform data analysis and inform government decision making, why are they seen as more elementary than other criminal justice technologies?

The policy interventions needed for criminal intelligence databases also mirror some of the advocacy demands being made of A.I. systems. For example, just as mandatory “algorithmic audits” or “algorithmic impact assessments” are gaining traction as a way to allow external researchers and advocates to interrogate the logics and data used in A.I. systems, advocates are demanding similar pathways to access the logic and contents of gang databases. As the public begins to reject “new” forms of police technology before they are entrenched, we must not miss the opportunity to question the legitimacy of legacy technologies, too.


Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.