
Does Predictive Policing Really Result in Biased Arrests?

In experimental rollouts across the nation, predictive policing models have shown a remarkable ability to help police officers and other law enforcement officials clamp down on illegal activity and reduce crime. The open question, however, is whether these predictive policing methods lead to systematic bias against certain minority communities or ethnic groups. According to the latest study, led by George Mohler, a researcher at Indiana University – Purdue University Indianapolis (IUPUI), there is no statistically significant evidence of racial bias.

This IUPUI study, which has been touted in the media as the first study of its kind to look at real-time patrol data, seems to suggest that local communities in the United States should feel safe in giving the green light to future predictive policing rollouts. But is that really the case?

Inside the new study on predictive policing by IUPUI

The study, based on arrest data provided by the Los Angeles Police Department (LAPD), examined arrest incidents recorded during empirical field trials. This focus on real-time patrol data from the field is an important distinction, because previous studies of predictive policing and bias have relied on simulated patrols run against historical data.

When the researchers looked at the actual real-time data flowing in, they could not discern any significant differences between arrests made under patrol strategies suggested by a human analyst and arrests made under patrol strategies suggested by a computer algorithm. In other words, the computer algorithms did not display bias as part of their crime forecasting.
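To make that comparison concrete, here is a minimal sketch (in Python, with invented per-patrol arrest counts) of how such a check might look: a simple permutation test on arrest counts from the two conditions. This illustrates the general statistical approach, not the study’s actual methodology.

```python
import random

# Hypothetical per-patrol arrest counts for one demographic group under
# analyst-directed and algorithm-directed patrols (all numbers invented).
analyst_arrests = [2, 0, 1, 3, 1, 0, 2, 1]
algorithm_arrests = [1, 2, 0, 2, 1, 1, 3, 0]

def permutation_test(a, b, trials=10_000):
    """Two-sided permutation test on the difference in mean arrests per patrol."""
    observed = sum(b) / len(b) - sum(a) / len(a)
    pooled = a + b
    n = len(a)
    extreme = 0
    for _ in range(trials):
        random.shuffle(pooled)
        diff = sum(pooled[n:]) / (len(pooled) - n) - sum(pooled[:n]) / n
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme / trials  # p-value: small values indicate a real difference

p = permutation_test(analyst_arrests, algorithm_arrests)
print(f"p-value: {p:.3f}")  # a large p-value means no significant difference detected
```

Run per demographic group, a test along these lines would show whether algorithm-directed patrols shift arrests toward any particular population.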

While the researchers did concede that arrest levels were higher in certain geographic areas of the city, this can be explained by the fact that some areas simply have more crime: a high-crime area will always generate a statistically higher number of arrests than a relatively low-crime one. In layman’s terms, sending a patrol unit to a rough neighborhood known for drug use and prostitution will always produce more arrests than sending that same patrol to an upscale, quiet neighborhood. Makes sense, right?
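A quick back-of-the-envelope simulation makes the base-rate point. Assuming purely hypothetical arrest rates of 2.0 per patrol in a high-crime area and 0.2 in a low-crime one, an entirely unbiased patrol policy still produces a roughly tenfold arrest gap:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (invented) arrest rates per patrol shift.
HIGH_CRIME_RATE = 2.0
LOW_CRIME_RATE = 0.2
N_PATROLS = 100  # the same number of patrols sent to each area

# Model each patrol's arrest count as a Poisson draw at the area's rate.
high_area_arrests = rng.poisson(HIGH_CRIME_RATE, N_PATROLS).sum()
low_area_arrests = rng.poisson(LOW_CRIME_RATE, N_PATROLS).sum()

print(high_area_arrests, low_area_arrests)  # roughly 200 vs. 20
```

The tenfold gap here comes entirely from the underlying crime rates, not from any bias in where the patrols were sent.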

(Editor’s note: George Mohler is the lead researcher for the IUPUI study and is also co-founder and board member of PredPol, a predictive policing company.)

The potential bias of predictive policing

Interestingly, previous studies have hinted at potential bias associated with predictive policing, including one well-known study that looked at Oakland drug arrest data. That study found that police officers guided by predictive policing algorithms tended to make more biased stops and arrests than officers using traditional policing strategies.

What it all comes down to, apparently, is the data set being used. As noted above, studies built on historical data and computer simulations tend to find racial and discriminatory bias, while the new IUPUI study, based on real-time empirical data, does not.

As George Mohler of IUPUI points out, a lot comes down to the type of data being fed into the computer algorithms: “One important consideration is what data to use as input into an algorithm. Certain data, for example drug arrests, may have bias to begin with and therefore an algorithm using the data will also be biased.”
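The worry, in concrete terms, is a feedback loop: if an algorithm allocates patrols in proportion to past arrests, and new arrests occur where patrols are sent, an initial skew in the data never gets corrected. The toy simulation below (all numbers invented) sketches that dynamic; it models the general argument, not any real deployed system.

```python
# Two districts with identical true crime rates, but historical arrest data
# that is skewed 60/40 by some past recording bias (hypothetical numbers).
TRUE_CRIME_RATE = [1.0, 1.0]
arrests = [60.0, 40.0]

for step in range(10):
    total = sum(arrests)
    patrol_share = [a / total for a in arrests]  # allocate patrols by past arrests
    # New arrests track where patrols go, since the true crime rates are equal.
    new_arrests = [TRUE_CRIME_RATE[i] * patrol_share[i] * 100 for i in range(2)]
    arrests = [arrests[i] + new_arrests[i] for i in range(2)]

shares = [round(a / sum(arrests), 2) for a in arrests]
print(shares)  # stays at [0.6, 0.4]: the initial skew is never corrected
```

The algorithm never “discovers” that the two districts are identical, because its only evidence comes from the patrols it directed in the first place.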

As a result, Mohler’s research team tried to minimize the impact of any bias: “In our experiment we focused on using event data taken from reports by victims of burglary and motor vehicle theft. These types of events may have less room for bias given that they are not largely driven by discretionary arrests. Police departments should also collect data on when and where they are making patrols based upon predictive algorithms. They can then analyze the demographic distribution of residents in those areas and monitor whether certain populations are receiving more or less patrol. Third, there is some new research on how to remove bias using algorithms that have fairness built into them. Our research group is doing some work in this area.”
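Mohler’s second suggestion, monitoring the demographic distribution of algorithm-directed patrols, is straightforward to prototype. The sketch below assumes a hypothetical patrol log and invented demographic shares:

```python
from collections import defaultdict

# Hypothetical log of (area_id, patrol_hours) for algorithm-directed patrols.
patrol_log = [("A", 120), ("B", 40), ("C", 80)]

# Invented census-style demographic shares for each patrolled area.
demographics = {
    "A": {"group_1": 0.7, "group_2": 0.3},
    "B": {"group_1": 0.2, "group_2": 0.8},
    "C": {"group_1": 0.5, "group_2": 0.5},
}

# Estimate each group's share of total patrol exposure.
total_hours = sum(hours for _, hours in patrol_log)
exposure = defaultdict(float)
for area, hours in patrol_log:
    for group, share in demographics[area].items():
        exposure[group] += share * hours / total_hours

for group, share in sorted(exposure.items()):
    print(f"{group}: {share:.1%} of patrol exposure")
# Compare these shares against citywide demographics to flag over-patrolled groups.
```

A department running this kind of audit on a regular schedule could catch skewed patrol allocations before they translate into skewed arrest data.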

And yet, predictive policing programs still carry a negative connotation in many large urban areas. Members of racial and ethnic minority groups fear that the data used in these programs will inadvertently ensnare them when they have done nothing wrong, leading to biased arrests. In some cases, in fact, these programs might even lead to bias against women.

