With the exploding popularity of OpenAI’s ChatGPT, Google is heavily investing in its alternative: Google Bard. However, Google’s new tool isn’t perfect, often suffering from hallucinations. But what can be done about hallucinating AI?
In a recent interview, Google senior vice president Prabhakar Raghavan warned that hallucinating AI is a byproduct of modern chatbots such as ChatGPT and Google Bard. In the past, rival tech giants such as Meta have warned that AI can produce convincing hallucinations during conversations.
Via Reuters, Raghavan explained that Google is working to keep hallucinations to a minimum within Bard. However, the Google executive warned that it may be impossible to stop the worrying trend of hallucinating AI.
“This kind of artificial intelligence we're talking about right now can sometimes lead to something we call hallucination,” Raghavan said. “This then expresses itself in such a way that a machine provides a convincing but completely made-up answer.”
With Google planning to incorporate its Bard AI into Google Search, the artificial intelligence needs to be as accurate as possible. In its current form, the program can hallucinate entirely fabricated facts and has no reliable way of correcting itself afterwards.
False information has already been a huge issue for Google’s AI. In its official reveal tweet, Bard gave an incorrect answer to a prompt. The blunder wiped roughly $100 billion off parent company Alphabet’s market value, possibly making it AI’s most expensive mistake to date.
Raghavan acknowledged that Google is trailing behind companies such as OpenAI. However, the company doesn’t want to mislead the public by rushing an unreliable AI into Google Search.
Despite Google’s reluctance, other companies are already pushing AI chatbots into their search engines. In fact, Microsoft has already integrated ChatGPT technology into its Bing search engine, despite the tool’s obvious limitations.
AI-powered search results are likely to be the next stage of the internet. In the end, they may hurt small-scale websites, such as ours, while lifting content from larger ones.