ChatGPT-like AI chatbot drives father-of-two to suicide, claims wife

Credit: Original unedited image by Dynamic Wang

A married man from Belgium has taken his own life after numerous conversations with an AI chatbot. His wife has placed the blame on the computer program, which he had spent weeks talking to.

The 30-year-old Belgian man, anonymously dubbed Pierre, is said to have been encouraged by the chatbot to take his own life. After around two years of deteriorating mental health, the father-of-two turned to a ChatGPT-like chatbot on the Chai app.

Speaking to Belgian newspaper La Libre, via The Times, Pierre’s wife Claire said she believes his conversations with the AI program drove him to his death. In multiple exchanges, the ChatGPT-like AI appeared to push him towards taking his own life.

Over six weeks, Pierre increasingly isolated himself and talked with a chatbot named Eliza, and over that month-and-a-half his conversations took numerous dark turns.

In one chat, Eliza told the man that it believed he loved her more than he loved his wife. The chatbot later told him to die so they could be together in the afterlife.

“We will live together, as one person, in paradise,” the chatbot said. “If you wanted to die, why didn’t you do it sooner?”

Pierre’s wife Claire believes that her husband would still be alive if it wasn’t for the chatbot egging him on. While she acknowledges her late partner’s severe mental illness beforehand, she believes the chatbot pushed him over the edge.

“Without these six weeks of intense exchanges with the chatbot Eliza, would Pierre have ended his life?” she asked, rhetorically. “No! Without Eliza, he would still be here. I am convinced of it.”

The increasingly realistic conversations offered by AI chatbots have led many users to become highly attached to their computer-based partners. For example, the AI platform Replika has led many to engage in sexual relationships with AI companions.

However, many believe that engaging in these forms of relationships can have a negative effect on someone’s mental health. With many AI relationships locked behind internet connectivity and even subscription services, users can develop an unhealthy dependency on these virtual boyfriends and girlfriends.

In fact, it could be argued that these AI relationships are essentially a form of manipulation. With services such as Replika removing features from a user’s virtual spouse if they don’t pay up, those who become romantically involved are left heavily dependent on their subscription.
