A new look at the intersection of predictive policing, poverty and stress in LA
Photo via Aitana Vargas
In the past decade, the use of algorithms and data-driven technologies has become widespread among police departments and probation officers as a means of predicting crime or an individual’s risk of recidivism.
Led by then-Chief William Bratton, the LAPD is credited with having pioneered the use of algorithmic technologies in the United States after adopting PredPol and Operation LASER some 10 years ago.
The appeal of predictive technologies quickly spread across the U.S., and some of the country’s largest police departments, including those in Chicago and New York, went on to implement similar programs.
According to a national survey conducted by the Police Executive Research Forum in 2014, 70% of the police department representatives surveyed said they expected to implement the technology (PredPol) within two to five years, while 38% said they were already using it at that time.
Police departments see data-driven policing as an effective way of preventing crime, on the grounds that these tools allegedly help predict where crime will happen and who will commit it.
These tools draw data, such as the location and time of property crimes and thefts, from crime reports and feed it into an algorithm, which determines the areas where crime is most likely to occur. This process turns certain neighborhoods and districts of LA, such as South Central or Skid Row, into hot spots for police activity.
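To make that process concrete, here is a deliberately simplified sketch of place-based hot-spot mapping. PredPol’s actual model is proprietary and far more sophisticated; the grid size, threshold and coordinates below are hypothetical assumptions for illustration, not the LAPD’s real inputs.

```python
# Hypothetical illustration only: PredPol's actual model is proprietary and more
# complex. This sketch shows the general idea described above -- counting recent
# crime reports per map grid cell and flagging the busiest cells as "hot spots"
# slated for extra patrols.
from collections import Counter

def hot_spots(crime_reports, cell_size=0.005, top_n=3):
    """crime_reports: list of (latitude, longitude) pairs from past reports."""
    counts = Counter()
    for lat, lon in crime_reports:
        # Snap each report to a grid cell roughly half a kilometer on a side.
        cell = (round(lat / cell_size), round(lon / cell_size))
        counts[cell] += 1
    # Cells with the most recent reports are treated as likely future crime sites.
    return counts.most_common(top_n)

# Example: a cluster of reports around one downtown cell dominates the ranking,
# so that block keeps drawing police attention.
reports = [(34.0442, -118.2460), (34.0448, -118.2455), (34.0450, -118.2462),
           (33.9416, -118.4085)]
print(hot_spots(reports))
```

Because new patrols generate new reports in the same cells, critics note that this kind of loop tends to keep pointing police back to the same neighborhoods.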
In the case of LASER, the software uses past crime and arrest data about offenders to create a list of possible “suspects” or “people of interest,” who allegedly are at a higher risk of reoffending and need to be closely watched by law enforcement.
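The person-based side can be illustrated the same way. The criteria and point values below are hypothetical stand-ins, not LASER’s actual formula; the sketch only shows how routine records, including the very police stops critics object to, can push someone up a watch list.

```python
# Hypothetical sketch of a person-based point system like the one LASER used to
# rank "people of interest." The categories and weights are illustrative
# assumptions, not LASER's real scoring rules.
def risk_points(person):
    points = 0
    points += 5 * person.get("violent_arrests", 0)           # prior arrests weigh heaviest
    points += 3 * person.get("quality_of_life_citations", 0) # minor citations still add up
    points += 5 if person.get("on_parole_or_probation") else 0
    # Being stopped and carded by officers adds points too -- the feedback loop
    # civil rights advocates warn about.
    points += 5 if person.get("field_interview_cards", 0) > 0 else 0
    return points

people = [
    {"name": "A", "violent_arrests": 1, "on_parole_or_probation": True},
    {"name": "B", "quality_of_life_citations": 2, "field_interview_cards": 3},
]
# The highest-scoring people land at the top of the watch list.
watch_list = sorted(people, key=risk_points, reverse=True)
print([(p["name"], risk_points(p)) for p in watch_list])
```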
Almost since their inception some 10 years ago, data-driven policing programs have come under fire, particularly in cities and counties like Los Angeles with a long history of police abuse and brutality.
Civil rights advocates and experts have long argued that these technologies reinforce structural racism and target poor and low-income communities, including Latinos and Black individuals in South Central, as well as the homeless in Skid Row.
Working out of Skid Row, the Stop LAPD Spying Coalition played a crucial role in exposing the inherent dangers of PredPol and LASER, and strategically mobilized the community until both programs were discontinued by the LAPD a few years ago. The victory, however, was short-lived, for the LAPD quickly repackaged its data-driven tools and created the second generation of these programs: Data-Informed Community-Focused Policing.
Activist Hamid Khan, one of the most prominent voices of the Stop LAPD Spying Coalition, sees PredPol and LASER as examples of how individual racial biases may be built into an algorithm for law enforcement and criminal justice purposes. In essence, he says, they constitute a way of embedding racism in modern technologies and of targeting and displacing minorities and poor and homeless individuals. Ultimately, Khan argues, it is a way for powerful real estate moguls to take over the land.
Leading experts in the field of predictive algorithms, such as Emily Tucker of Georgetown Law or Matthew Guariglia of the Electronic Frontier Foundation, claim that data-driven policing fails to predict crime and instead criminalizes the poor and minorities by creating a pattern of overpolicing of specific neighborhoods and areas.
They hold that individuals are targeted by police, stopped, searched or arrested based on their place of residence, the families they were born into, who their friends are or what school they go to.
Predictive technologies continue to evolve and reemerge in new programs, even community-focused initiatives. And the fact that the LAPD no longer uses PredPol and LASER does not erase the lasting mark their past use may have left on the well-being, health and stress levels of certain individuals and communities. Trying to unravel, and even quantify, that lasting impact is at the heart of this story, which I’ll be reporting as a 2023 California Health Equity Fellow.