Facebook is corrupting AI, making us worried about the robot uprising

You know what, Facebook might not be good for the world.

You'll no doubt have seen memes about its effects on society in general, especially recently, but that's only part of it.

When you've also had huge, high-profile scandals like Cambridge Analytica, it's probably time to consider just not using the service.

Well, you can now add this little bit of news, which suggests it might also be turning AI evil.

Facebook has corrupted an AI and turned it evil

As reported by Vice, a social media chatbot called Lee Luda, made by a South Korean company, had to be shut down because it started spouting racist and homophobic nonsense.

The company, called Scatter Lab, had to apologise for the chatbot's bigotry, and it turns out the issue was with the way it was learning how to be human: because of the chat records it was fed and the interactions it had, it ended up turning evil.

"We sincerely apologize for the occurrence of discriminatory remarks against certain minority groups in the process. We do not agree with Luda's discriminatory comments, and such comments do not reflect the company's thinking."

What's worse is the fact that this isn't even the first time that social media has corrupted an AI.


Yup, this has happened before

If you cast your mind back to 2016, you might remember that Microsoft's Tay chatbot did basically the exact same thing.

In fact, there's an excellent report on it by PCGamesN that breaks down not only what happened, but also how Taylor Swift nearly sued Microsoft because of it.

While these AIs are currently only plugged into social media, it's not hard to feel a little uneasy about how easily they can be corrupted.

It kind of makes you despair.

