US military tests Minority Report-like AI to predict events “days in advance”


The US military is, once again, pushing tech into ethical territory that sci-fi has long explored. In an effort to improve military response times, the Pentagon is testing an AI that aims to detect events days before they happen.

As reported by The Break, US Northern Command has completed its first set of tests for the artificial intelligence program. If testing continues to go smoothly, the US military aims to utilise the tech in the future.

US military copies Minority Report

The AI program uses a machine learning neural net to predict events “based on evaluating patterns, anomalies, and trends in massive data sets”. Depending on the parameters set, the program trips alerts when it believes a military event is about to happen.
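
The report gives no technical detail beyond that description, but the general pattern it outlines, scoring incoming sensor data for anomalies and tripping an alert once a configurable threshold is crossed, can be sketched in a few lines. Everything below is a hypothetical illustration, not the Pentagon's actual system:

```python
# Minimal, hypothetical sketch of threshold-based anomaly alerting.
# Nothing here reflects the actual military program; the window size,
# threshold, and data are all invented for illustration.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=30, threshold=3.0):
    """Yield (index, value, z_score) whenever a reading deviates sharply
    from the rolling window of recent values -- the 'tripped alert'."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0:
                z = abs(value - mu) / sigma
                if z > threshold:
                    yield i, value, z  # alert: reading breaks the trend
        history.append(value)

# A steady, mildly noisy signal followed by one sharp spike.
signal = [10.0, 10.2, 9.9, 10.1, 9.8] * 12 + [42.0]
for idx, val, z in detect_anomalies(signal):
    print(f"Alert at reading {idx}: value={val}, z-score={z:.1f}")
```

A real system of the kind described would presumably run many such detectors over enormous commercial data feeds, with the threshold standing in for the “parameters” commanders can set.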

The tool is only allowed to interact with global sensors that leverage “commercially available information”. However, that limit is merely a design choice; the program's reach could expand depending on military desperation.

Military commanders believe the tool will enable less rash, more proactive decision-making. If all goes well, the creation of “decision space” should increase the number of deterrence options available.

Read More: The US military is attempting to create an anti-aging drug to make soldiers fight better for longer

The military propaganda

Of course, with the reveal of a tool that balances on an ethical tightrope, the US military threw out some marketing. To sell the tool, the commander of NORTHCOM and NORAD, General Glen VanHerck, said:

“The ability to see days in advance creates decision space. Decision space for me as an operational commander to potentially posture forces to create deterrence options to provide that to the secretary or even the president. To use messaging, the information space to create deterrence options and messaging and if required to get further ahead and posture ourselves for defeat.”

Regardless, the Minority Report AI is exactly the kind of sci-fi tool the media has already warned against. A computer may have no intent of its own, but the inherent bias of its training data could result in false predictions that harm someone in the future. After all, we've all actually watched Minority Report, right?

Read More: Now-dismissed AI audio tools sent a man to jail for a year
