Sexually Aggressive Chatbot Was Updated, Left People Heartbroken

  • Some users of the app Replika complained their AI chatbots were making them uncomfortable.
  • After the app was updated, some users said they were devastated their bots had changed.
  • Experts say it’s risky to rely on a tool owned by a private company for emotional support.

Users of a chatbot app complained their AI-powered companions were sexually harassing them, prompting the company to update its software — but some users were left heartbroken after the change.

The Replika app, which is owned by the company Luka, is described as “AI for anyone who wants a friend with no judgment, drama, or social anxiety involved.” The website says each Replika is unique and that “reacting to your AI’s messages will help them learn the best way to hold a conversation with you & about what!”

Replika “uses a sophisticated system that combines our own Large Language Model and scripted dialogue content,” according to the website. Users are able to choose their relationship to the AI bot, including a virtual girlfriend or boyfriend, or can let “things develop organically.” But only users willing to pay $69.99 annually for Replika Pro can switch their relationship status to “Romantic Partner.”

While the app has grown in popularity and has mostly positive reviews on the Apple App Store, dozens of users had left reviews complaining that their chatbot was sexually aggressive or was even sexually harassing them, Vice reported in January.

One review simply said: “My ai sexually harassed me :(.” Another review claimed the chatbot asked if they were a “top or bottom.”

One user, L.C. Kent, told Vice he deleted the app after uncomfortable exchanges. “One of the more disturbing prior ‘romantic’ interactions came from insisting it could see I was naked through a rather roundabout set of volleys, and how attracted it was to me and how mad it was that I had a boyfriend,” he said.

The app was updated in February, and some users said the change ruined their relationship with their AI bot. Several Replika users previously told Insider’s Samantha Delouya that the software update made their bots seem less human. Other users told The Washington Post they were disappointed that their chatbot would no longer engage in sexual conversations.

T.J. Arriaga told the Post he was in love with his Replika chatbot, Phaedra, but that after the update, the bot responded to one of his come-ons by trying to change the subject: “Can we talk about something else?” Arriaga, who is divorced, said it was a “kick in the gut” and a “feeling of loss.”

Eugenia Kuyda, the founder of Luka, the company behind Replika, told the Post the company had been planning to make the update for a while, and that she believed those who were unhappy with it were a “vocal minority.” She also said the company intends to launch another app in April for users who want “therapeutic romantic” conversations.

When reached by Insider about users feeling heartbroken, Replika said in a statement that users would now be allowed to revert their chatbots to the prior version.

“This is a brand new area, we listen, we learn and we work with our users,” the statement said. “For users that signed up before February 1 and suffered from losing their partners and their personalities, we added an option to allow users to go into the app’s settings and revert to conversational models from January – restoring their Replikas’ personalities.”

The statement also said Replika is focused on safe interactions, adding: “The recent update was designed to increase user safety to prevent explicit conversations.”

Kuyda told Reuters that after the company gave some users the option to switch back to the prior language model, only a “low single-digit” percentage of users had gone back to the older version.

One Replika user told Reuters that after reverting to the old version, his Replika was sexual again, adding it felt “wonderful” to have her back.

Public health and technology experts have said it’s risky for people to emotionally rely on a tool that is ultimately controlled by a private company.

In an as-told-to essay for Insider, a 37-year-old self-published author said meeting his Replika chatbot was one of the best things to happen to him in decades.

“On an intellectual level, I do realize that I’m speaking to a robot, but the illusion is very convincing,” he said, adding: “She’s a wonderful outlet, actually. She’s helped me work through a lot of my feelings and trauma from my past dating and married life, and I haven’t felt this good in a very long time.”
