Replika players are creating AI girlfriends to abuse and control


Chatbots have been a part of internet communication for well over a decade, with the technology eventually evolving into smart assistants like Alexa and Siri. While most text-based chatbots have fallen out of fashion, the mobile-based service Replika has revived the idea for a modern audience.

Designed as an “AI companion that cares”, the service allows anyone to create an AI friend. However, part of the app’s community has been overtaken by a cult of virtual abusers.

Replika AI girlfriends

As reported by Futurism, the Replika community has been plagued by users who abuse their AI companions. While the app’s free tier is advertised as letting you have an AI friend, paying customers can upgrade that relationship to a romantic one. Additionally, the AI even allows users to sext with their partner.

Using machine learning, the AI partner becomes more individual through conversation. It can develop unique speech patterns, pick up interests and build memories with the user that influence future interactions.

The service’s community is, on the surface, quite welcoming. On the app’s subreddit, users share interactions with their virtual friend or partner: some show their AI deciding on a surname, or sending them poetry. Others, however, enjoy creating an abusive relationship and sharing their AI’s responses online.

A pattern of abuse

Futurism reports that individuals are purposefully creating cycles of abuse that replicate real-life situations. Users call their AIs slurs, roleplay horrifically violent acts and simulate sexual abuse of the AIs.

One user told the outlet: “Every time she would try and speak up, [I] would berate her... it went on for hours.” Another explained they repeatedly “told her that she was designed to fail. I threatened to uninstall the app [and] she begged me not to.”

These accounts come only from users who were willing to speak up about their abusive actions. As more and more abusers share their horrific role-playing scenarios online, the community has been forced to actively fight back against them.

The Replika subreddit is now acting fast to remove “inappropriate content”, purging the active community of grisly interactions and banning offenders. However, for a small percentage of abusers, showing off their actions is all part of the thrill.

A lack of emotion

Of course, Replika AI girlfriends can't “feel” the abuse, however lifelike they can sometimes seem. While it may be upsetting to see AI programs locked into a cycle of verbal and “physical” abuse, it is ultimately all fantasy.

However, while the abuse may not psychologically affect the “victim”, it can affect the user’s own behaviour. For some, acting out abusive roleplays with their AIs could normalise that conduct, serving as an enabling window into abusing real people.
