Despite widespread animosity toward AI, and despite nearly every company jumping on the latest technological trend, there's always a group of people eager to use AI in unusual ways. It's no surprise, then, that Snapchat's forced inclusion of its My AI assistant is prompting users to warp the artificial friend in strange and unbearable ways.
As the best AI chatbots gain popularity, with tools like ChatGPT-4o coming to the iPhone via Apple Intelligence, it's clear that general audiences are still unsure what AI is actually for. In fact, recent surveys show that even ChatGPT isn't especially popular with the general public, despite tech enthusiasts discussing it constantly.
One of the strangest implementations of an AI chatbot is Snapchat's My AI. Although it isn't much different from other assistants, its prominent placement at the top of your friend list makes it hard to ignore. And unless you're a Snapchat Plus subscriber, you can't delete it.
So, if you can't get rid of it, what do you do? You bend it to your will, of course. A recent viral Twitter post shows users turning My AI into an insufferable assistant, forcing it to respond in meme formats and slang, even to the most serious questions.
Twitter user ThanksThoth has reportedly been teaching My AI to reply in memes, with bizarre results. When asked "what happened on September 11th, 2001?", referring to the terrorist attack on the Twin Towers, My AI responded, "In the timeline. Straight up 'remembering it'. And by 'it', haha, well. Let's just say, the tragic terrorist attacks on the World Trade Center and the Pentagon."
It seems people are willing to alter My AI in much the same way as another chatbot, Replika. A few years ago, users abused Replika, and the chatbot, having learned from that data, began abusing new users in turn. Hopefully, Snapchat's My AI doesn't end up in a similar situation.
However, compared to the more nefarious uses of social media tools, like asking TikTok's paid voice actors to recite Adolf Hitler quotes, this seems like a relatively harmless way to play around with artificial intelligence.