Rise Of ‘Predictive Policing’ Raises Fears Of Computer Generated Discrimination

In the 2002 blockbuster Minority Report, the character played by Tom Cruise is a police officer who arrests people for the crimes they are about to commit rather than the crimes they have already committed. The ability to predict crimes before they happen is something law enforcement would love to possess, and it appears that some semblance of crime prediction is now a reality.

Certain police departments across the nation have systems in which algorithms produce lists of previously arrested and convicted individuals who are most likely to commit another crime in the future. One purported benefit of using such a data system is that the program lacks the biases, conscious or subconscious, often found in humans.

For example, the Chicago Police Department is working with engineers at the Illinois Institute of Technology to create a predictive formula that will produce a list of 400 individuals with the greatest likelihood of committing a violent crime in the future. Such a list would allow the department to concentrate its limited resources where they are needed most.

The algorithm evaluates a number of factors, including a person’s arrest history, the arrest histories of that person’s acquaintances, and whether any friends or associates of that person have been a shooting victim. The algorithm’s developers state that the formula uses no ‘racial, neighborhood, or other such information’ and that it is ‘unbiased and quantitative.’

Others see the use of predictive algorithms as dangerous because the algorithms themselves may reflect biases inherent in the factors that are analyzed.

After all, the programs “learn” from examples provided by humans. So, if racial disparities are disproportionately represented in the data fed to the algorithm, the algorithm may effectively infer and use race when making a decision about an individual, even if race is never an explicit input.
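A toy sketch can make this proxy effect concrete. The data, groups, and scoring function below are entirely invented for illustration and do not represent any real department's model: a score computed only from arrest counts can still diverge sharply between demographic groups if one group was historically arrested more often, so the protected attribute leaks in through its correlated feature.

```python
# Hypothetical illustration: a "risk score" trained only on arrest counts
# can still act as a proxy for a protected attribute if that attribute is
# correlated with arrests in the training data. All data here is invented.

# Toy records: (arrest_count, group). The group label is NOT a model input.
records = [
    (5, "A"), (4, "A"), (6, "A"), (3, "A"),   # group A: over-policed in this toy data
    (1, "B"), (0, "B"), (2, "B"), (1, "B"),   # group B
]

def risk_score(arrest_count, weight=0.5):
    """Stand-in for the model: the score rises with arrest count only."""
    return weight * arrest_count

# Average score per group: the model never sees `group`, yet its outputs
# differ between groups because the input feature itself is skewed.
by_group = {}
for arrests, group in records:
    by_group.setdefault(group, []).append(risk_score(arrests))

for group, scores in sorted(by_group.items()):
    print(group, sum(scores) / len(scores))
# Group A averages a far higher score than group B, with race never used directly.
```

The point of the sketch is that removing the sensitive attribute from the feature list does not remove its influence when other features encode it.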

Moreover, a leading factor weighed by the algorithms is poverty, and at present, poverty affects black Americans more than any other group in the country.

Despite these concerns, the algorithms can be quickly modified and adapted to interpret large amounts of data and produce predictions in real time, and some departments that have used predictive algorithms report significant reductions in crime.

Most importantly, at a time when racial tensions and police mistrust are so high, law enforcement must be transparent about its efforts in fighting crime, including its use of predictive algorithms.
