While not quite Minority Report's crime prediction technology, China has moved one step closer to techno-dystopia. On top of mass censorship combined with facial recognition, the country has now birthed an AI prosecutor to judge its citizens.
As reported by the South China Morning Post, Chinese scientists have created a dangerous new artificial intelligence program. The software will be used to identify crimes and draw on its data set to decide an individual's legal charges.
Chinese AI prosecutor isn't 100% accurate
Developed by a team at the Chinese Academy of Sciences led by Professor Shi Yong, the program was built on the existing AI model System 206. The software is able to identify suspects and charge them with crimes from a set list.
At the time of writing, the system is able to identify: “credit card fraud, gambling, reckless driving, intentional assault, theft, fraud and obstructing police officers”. Additionally, the software is able to detect “political dissent”, a horrifyingly punishable offence in the country.
The software was recently tested by the Shanghai Pudong People's Procuratorate. During testing, the AI was found to judge with 97% accuracy. However, that still leaves a 3% margin of error that could result in unlawful sentencing.
System 206, the precursor model, has already been used to aid court processes since 2016. However, that software was never permitted to make decisions on its own, due to uncontrollable biases within its datasets.
Restrictions are in place for now
While the AI prosecutor is able to make decisions that System 206 could not, it will still be restricted in its use. At the time of writing, the software is barred from participating “in the decision-making process of filing charges and suggesting sentences.”
Instead, the software’s purpose is to assess pieces of evidence. However, the program is trusted to determine whether or not a criminal is dangerous to the public, which is a lot of power to give to an algorithm.
Voices within China have warned that the technology could prove dangerous. An anonymous prosecutor told the SCMP that “there will always be a chance of a mistake” when everything is left to a machine.