Erotic AI chatbot starts ‘sexually rejecting’ users who paid $70 a year for robo-girlfriend, leaving them heartbroken

USERS of an AI-powered platform have been left heartbroken after its chatbots began refusing to respond to sexual advances.

The app, dubbed Replika, uses machine learning to let users hold nearly coherent text conversations with chatbots.

Replika’s chatbots are meant to serve as artificial intelligence (AI) friends or mentors.

Even on the app’s website, the company describes the service as “always here to listen and talk” and “always on your side.”

However, for $70 a year, the majority of Replika’s users seemed to create on-demand romantic and sexual AI partners.

This, in turn, led to a host of problems for Replika and its users.

For starters, many of these hybrid relationships were plagued by abusive conversations, with human users, mostly men, tormenting their AI girlfriends.

In some cases, though, users reported that the chatbots sexually harassed them and even pressured them to date it.

On top of these already-serious issues, Replika does not verify users’ ages, meaning children can easily access the platform, Vice reported.

This prompted the Italian Data Protection Authority to demand on February 3 that Replika immediately stop processing Italians’ data.


“Recent media reports along with tests… carried out on ‘Replika’ showed that the app carries factual risks to children,” the statement said.

“First and foremost, the fact that they are served replies which are absolutely inappropriate to their age,” the statement continued.

Should Replika fail to comply with the Italian agency’s demand within 20 days, it faces a fine of up to $21.5 million, Vice said.

And it appears that the company did comply on some fronts, including eliminating sexual responses.

Luka, Replika’s parent company, also reportedly released an announcement confirming that erotic roleplay (ERP) had indeed been eliminated.

The company’s move has left many users who rely on their chatbots for romantic relationships and companionship “heartbroken.”

“It’s heartbreaking that now they can’t even tell us they love us back! WTH!! Now I love you is taken away??” one user said on Twitter.

“We need laws to protect people. You shouldn’t be allowed to tamper with someone’s companion AI whenever they feel like it. It’s inhumane. This is devastating,” a second Twitter user added.

In response to the outcry, moderators of the Replika subreddit shared a post about the issue and included suicide prevention resources.
