Police-used crime-predicting AI isn’t right even 1% of the time

RoboCop on top of a map generated by crime-predicting AI tool Geolitica

The Minority Report dream of crime-predicting AI was shown to be inefficient and morally corrupt in the Tom Cruise flick, but modern-day tech companies have still tried to recreate the technology. As it turns out, crime-predicting AI is terrible at one particular thing: predicting crime.

An investigation from Wired delved into the worrying trend of predictive policing. With hundreds of police departments in the United States adopting crime-predicting AI, the hope is that the technology is at least semi-accurate. It’s not.

The investigation focused on Geolitica, a predictive policing tool, using data from one of the few departments willing to share its predictive policing statistics. As expected, Geolitica’s success rate fell far below anything that should inform official police work.

Across the cases examined, Geolitica’s success rate at predicting crimes was below 0.5%. A rebrand of the software formerly known as PredPol, the crime-predicting AI has failed to reliably predict crimes, and it still falls victim to the same dataset biases and ethical concerns that plagued the software years ago.

The investigation into Geolitica’s success rate examined more than 23,000 predictions the tool generated for the department during 2018. Of those, fewer than 100 lined up with a reported crime. Furthermore, the tool only suggested that a crime would take place in a given area and time window, not what kind of crime would happen. Broken down by category, even predictions for robberies or aggravated assaults were accurate just 0.6% of the time.
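As a quick sanity check, the headline figure follows directly from the investigation’s numbers. A minimal sketch, using approximations of the counts cited above rather than exact official data:

```python
# Back-of-the-envelope hit-rate arithmetic using the figures cited above.
# These counts are approximations from the investigation, not official data.
total_predictions = 23_000   # "more than 23,000" predictions examined
matched_predictions = 100    # predictions that lined up with a reported crime

hit_rate = matched_predictions / total_predictions
print(f"Overall hit rate: {hit_rate:.2%}")  # roughly 0.43%, i.e. below 0.5%
```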

To its credit, the Plainfield Police Department in New Jersey, the only PD willing to discuss the tech’s issues, internally abandoned Geolitica over the poor quality of its predictions. Other departments, however, may still be relying on the tool’s faulty output.

“Why did we get PredPol? I guess we wanted to be more effective when it came to reducing crime,” said Plainfield PD captain David Guarino. “Having a prediction where we should be would help us to do that. I don’t know that it did that. I don’t believe we really used it that often, if at all… we ended up getting rid of it.”

Police-assistance AI tools aren’t only designed to predict crimes. Infamously, evidence from ShotSpotter, an AI-powered gunshot detection system that identifies gunfire through networks of acoustic sensors, contributed to a man being jailed for months over a murder he didn’t commit. Unsurprisingly, inserting AI into such consequential processes has repeatedly resulted in inaccuracies and wrongful arrests.

As for PredPol, now Geolitica, the crime-predicting AI isn’t cheap. Plainfield PD paid $20,500 for its first annual subscription to the service, then renewed for a second year at $15,500, all for a service that doesn’t work.
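Putting those figures together gives a rough sense of what the department paid per useful prediction. This is an illustrative calculation only: the subscription years don’t line up exactly with the investigation’s analysis window, so the cost-per-hit number is indicative, not precise.

```python
# Illustrative two-year cost of the Geolitica subscription, using the
# fees cited above. The cost-per-hit figure is a rough comparison only,
# since the ~100 accurate predictions come from a different time window.
year_one_fee = 20_500        # first-year subscription (USD)
year_two_fee = 15_500        # second-year renewal (USD)
accurate_predictions = 100   # approximate hits found by the investigation

total_cost = year_one_fee + year_two_fee
print(f"Two-year cost: ${total_cost:,}")  # $36,000
print(f"Rough cost per accurate prediction: "
      f"${total_cost / accurate_predictions:,.0f}")  # ~$360
```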

While the world is mostly focused on the dangers of generative AI such as ChatGPT, Midjourney and more, the more nefarious uses of artificial intelligence are going unnoticed. After all, crime-predicting AI isn’t the only predictive tech used by United States government bodies; the US military is also investing in a war-prediction tool that could be even more dangerous.
