Updated: Oct 23
What is predictive policing? While researching future blog posts, I read a lot of articles about predictive policing. It reminded me of the Tom Cruise movie Minority Report, in which a program called PreCrime predicts murders using specially mutated humans. While that is hardly how modern predictive policing works, the premise of stopping crime before it happens remains. I therefore set out to determine what predictive policing is, understand its applications, and examine how it can perpetuate existing racial biases.
Put simply, predictive policing is the attempt to forecast crime using algorithms: software analyzes massive amounts of historical data to predict, and ideally help prevent, future crimes. However, precisely because these programs rely on historical data, experts warn that such systems can reinforce existing racial bias.
Place-Based and Person-Based Predictive Policing
The place-based approach uses preexisting crime data to identify areas and times with a high risk of crime. An example of this method is PredPol, software pioneered inside the LAPD in the late 2000s. While advocates of predictive policing argue that computer algorithms can predict future crimes more accurately and objectively than police officers, critics warn that these systems lead to more aggressive policing in communities of color. They come to this conclusion because the systems are based on reported crimes: future crimes are therefore more likely to be predicted in neighborhoods where the reported crime rate is already relatively high. The basic assumption behind place-based predictive policing could thus be summarized as follows: after one crime is committed in an area, more are likely to follow.
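To make that assumption concrete, here is a minimal sketch of the place-based idea: rank areas by the density of reported incidents. The grid cells and incident data are entirely made up, and real systems like PredPol use far more sophisticated statistical models; this only illustrates why areas with more *reports* get flagged more often.

```python
from collections import Counter

# Hypothetical historical incident reports: (grid_cell, hour_of_day).
# Everything here is invented for illustration.
reports = [
    ("cell_A", 22), ("cell_A", 23), ("cell_B", 14),
    ("cell_A", 21), ("cell_C", 2), ("cell_A", 22),
]

def hot_spots(reports, top_n=2):
    """Rank grid cells by the number of reported incidents."""
    counts = Counter(cell for cell, _hour in reports)
    return [cell for cell, _count in counts.most_common(top_n)]

print(hot_spots(reports))  # cell_A ranks first: it has the most reports
```

Notice that the ranking knows nothing about crime itself, only about what was reported, which is exactly the critics' point.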
In his article Policing Predictive Policing, Andrew G. Ferguson examines the shift from place-based prediction of property crime to place-based prediction of violent crime. To him, the "insights of a rather rigorous empirical and scholarly approach to studying property-based crimes have been adopted without equivalent empirical studies to the problem of violent crime. While similar logic prevails, equivalent research does not".
Besides place-based predictive policing, there is supposedly another way to predict crime: person-based predictive policing. This approach focuses on identifying individuals expected to be involved in a crime, or more likely to fall victim to one. By analyzing risk factors such as past arrests or victimization patterns, algorithms try to flag high-risk individuals. Sophisticated data programs therefore map shootings and study the underlying human connections.
Field-Test of Predictive Policing Systems
In September 2011, the LAPD field-tested its predictive policing system, called Los Angeles Strategic Extraction and Restoration, or LASER for short. It was tested in the nine-square-mile Newton Division south of downtown, home at the time to 44 documented gangs and a high concentration of gun-related crime. The system, designed to point out where gun violence was thought likely to occur, identified five corridors. Furthermore, as Eva Ruth Moravec writes in her article Do Algorithms Have a Place in Policing?, LASER identified people who "were likely to be involved in crime based on a point system. Known gang members, for example, got five points. The worst offenders—the ones with the most points—would be featured in something called a "chronic offender bulletin," which looked like a wanted poster, circulated among officers."
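A point system like the one Moravec describes can be sketched in a few lines. The five points for known gang membership comes from her article; all other factors, weights, and names below are hypothetical, purely to show how such a score and "bulletin" ranking could work mechanically.

```python
# Sketch of a LASER-style point system. Only the 5-point weight for
# gang membership comes from the source; the rest is hypothetical.
RISK_POINTS = {
    "known_gang_member": 5,   # per Moravec's article
    "prior_arrest": 3,        # hypothetical weight
    "field_interview": 1,     # hypothetical weight
}

def risk_score(flags):
    """Sum the points for every risk factor flagged for a person."""
    return sum(RISK_POINTS.get(flag, 0) for flag in flags)

def chronic_offender_bulletin(people, top_n=3):
    """Return the highest-scoring individuals, like the bulletin did."""
    return sorted(people, key=lambda p: risk_score(p["flags"]),
                  reverse=True)[:top_n]

people = [
    {"name": "Person 1", "flags": ["known_gang_member", "prior_arrest"]},
    {"name": "Person 2", "flags": ["field_interview"]},
]
print([p["name"] for p in chronic_offender_bulletin(people)])
```

The simplicity is the point: a handful of additive weights decides who ends up on what looks like a wanted poster, which is why the inconsistencies the inspector general later found matter so much.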
Issues With Predictive Policing
Even though early studies of some predictive policing applications such as PredPol concluded that the algorithm was about twice as successful at predicting crime as human analysts, there are potential pitfalls. For instance, research showed that using historical data to work out trends can skew crime statistics, placing already heavily policed areas under even more scrutiny. This applies not only to places but also to people, which is why critics have voiced privacy concerns. Other pitfalls include overreliance on technology and the misreading of causal relationships.
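The skewed-statistics pitfall is essentially a feedback loop, and a toy simulation makes it visible. In the sketch below, two areas have identical underlying crime, but one starts with slightly more reports; patrols are allocated in proportion to reports, and patrols in turn generate more recorded incidents. All numbers are invented.

```python
# Toy feedback-loop simulation: identical real crime, but the area
# with more initial reports attracts more patrols, and patrols
# produce more recorded incidents. All values are hypothetical.
true_crime_rate = {"area_A": 10, "area_B": 10}  # identical real crime
reports = {"area_A": 12, "area_B": 8}           # area_A over-reported

def patrol_allocation(reports, total_patrols=10):
    """Split a fixed patrol budget in proportion to reported crime."""
    total = sum(reports.values())
    return {area: total_patrols * r / total for area, r in reports.items()}

DETECTION_PER_PATROL = 0.1  # fraction of crime recorded per patrol unit

for _ in range(5):
    patrols = patrol_allocation(reports)
    for area, crime in true_crime_rate.items():
        reports[area] += crime * DETECTION_PER_PATROL * patrols[area]

print(reports)  # the gap between two identical areas keeps growing
```

After five rounds, the absolute gap between the two identically dangerous areas has more than tripled, even though nothing about the real world changed: the data only reflects where the police were looking.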
Just like PreCrime in Minority Report, the program eventually met its end. According to Tim Lau, "LASER was shut down in 2019 after the LAPD's inspector general released an internal audit finding significant problems with the program, including inconsistencies in how individuals were selected and kept in the system".
I really hope we won't end up living in the future portrayed in Minority Report - what about you? And as always, stay curious!