Artificial intelligence systems paired with facial recognition are becoming commonplace among American police forces. America isn't alone in this; countries like Great Britain and especially China use the technology to make arrests. However, AI-assisted arrests aren't perfect, especially when the tools inherit the same biases police have long been criticised for.
Wrongful imprisonment as a result of AI arrests
As reported by WIRED, multiple AI-assisted arrests in the United States have resulted in wrongful imprisonment. In one case, that of Nijeer Parks in 2019, the wrong man was jailed for ten days because of an AI program. Other cases have involved far longer detentions.
The WIRED article discusses three such cases. Parks was arrested after facial recognition tools claimed he had robbed a store. Additionally, 28-year-old Michael Oliver was arrested after the same kind of tool selected him in a case where someone grabbed and smashed a teacher's smartphone. Finally, 43-year-old Robert Williams was arrested for allegedly stealing watches worth $3,600.
All three AI-assisted arrests were proven wrong. Michael Oliver was shown to have been at work during the incident; moreover, the culprit had no tattoos, whereas Oliver has multiple. Williams was found to have been streaming an Instagram Live at the time he supposedly stole the five watches. Parks was in a different city, 30 miles away, when his supposed crime was committed.
These aren't the only examples of this kind of technology failing. While not facial recognition, the AI audio tool ShotSpotter was used in the arrest of 65-year-old Michael Williams, who was jailed for 11 months before eventually being freed. After his release, Williams said:
“I kept trying to figure out, how can they get away with using technology like that against me? That's not fair. These devices are only installed in poor black communities, nowhere else. How many of us will end up in the same situation?”
The fight to ban facial recognition
WIRED reports that the three wrongfully arrested men are fighting to make AI-assisted arrests illegal. The three men, all black, explain that the tools used by police are biased against black men. This claim is not unfounded; facial recognition technology's bias against non-white people has proven so troublesome that the White House has called for an AI Bill of Rights to combat it.
According to the WIRED article, these AI tools pick out the wrong suspect 90% of the time. Despite this, the tools are becoming even more widespread among police forces. While cities like New York are turning against the tech, others are leaning into it more heavily than ever before.
It's a dangerous tightrope to walk, and it looks like police organisations around the world aren't afraid to sprint across it. While many have pointed out the dystopian nature of facial recognition, governments are far from afraid to lean into that dystopian, Big Brother mentality.