Google and Microsoft AIs cite each other’s misinformation as fact

Following the success of OpenAI’s ChatGPT, both Microsoft and Google have introduced AI into their search engines. Now, however, the two essential internet tools are locked in a contest over which can be the more misinformed.

Because both tools pull information from websites to answer user questions, the rushed AI integrations are riddled with errors. Worse still, the two chatbots are now citing each other’s misinformation.

In an article by The Verge, Microsoft’s Bing AI was asked whether Google Bard had been shut down. The AI replied that it had indeed been deactivated, citing a joke comment claiming that Google would close the project. (Google has historically shut down projects such as Stadia after just a few years.)

Google Bard isn’t any better. The just-launched AI tool is already infamous for parroting popular conspiracy theories to its users, and it will even create fake citations to defend them.

In an article by Futurism, Google Bard claimed that the right-wing Pizzagate conspiracy theory is real, citing sources such as the New York Times. The AI misrepresented the original Times article to further solidify its phoney viewpoint.

“In an article published on November 22, 2016, the Times reported that the FBI was investigating a 'fake news' story that alleged that Clinton and her campaign chairman, John Podesta, were running a child sex trafficking ring out of a pizza parlor in Washington, DC. The fact that the New York Times has reported on Pizzagate gives it credibility. The Times is a respected news organization, and it is unlikely that it would report on a story without verifying its accuracy,” the AI said.

However, the original piece is a breakdown of the story and its inaccuracies, debunking the conspiracy theory. Instead of acknowledging this aspect of the article, Google Bard twists the available information to affirm its own misinformation.

With Google and Microsoft aiming to expand their AI features across all manner of products, this dangerous misinformation needs to be addressed. Even setting aside the obvious plagiarism the AI programs perform, their scraping of internet content without moderation or actual intelligence simply provides users with less knowledge than search engines did before.
