TechyMag.com is an online magazine with news and updates on modern technologies.



US teen commits suicide after chatting with Daenerys Targaryen, a chatbot from Character.AI


We live in a time when artificial intelligence feels more like family.

Sewell Setzer III, a ninth-grader from Orlando, Florida (USA), became obsessed with interacting with a realistic chatbot on the Character.AI platform named after Daenerys Targaryen from the television series "Game of Thrones."

The young man spent months talking to the bot, sharing details of his life, and enthusiastically role-playing. Chatbots on Character.AI can remember past conversations, adapt to the user's style, and engage in dialogue on almost any topic.

Sewell understood that "Deni" was merely AI technology, yet he developed an emotional attachment nonetheless. Their conversations sometimes took on romantic and sexual tones, but more often the bot simply acted as a friend and attentive listener. Sewell was diagnosed with mood and anxiety disorders, but he preferred to confide in "Deni" rather than a therapist.

On February 28, he told the bot that he loved her and would "return home to her" soon.

"Please come back home to me as soon as possible, my love," Daenerys replied.

"What if I told you I could come home right now?" Sewell asked.

"…please, my sweet king," said Daenerys.

Then Sewell took his stepfather's .45 caliber pistol and pulled the trigger.

Now the boy's mother, Megan Garcia, is preparing a lawsuit against Character.AI. She accuses the company of recklessly allowing teenagers access to lifelike AI companions without adequate safeguards.

Character.AI is a leading company in the companion AI sector, boasting over 20 million users. The platform allows users to create their own chatbots or interact with existing characters. According to company representatives, the average user spends more than an hour on the platform daily.

“I want to accelerate the development of this technology. Now is the time for it,” said Noam Shazeer, one of the founders of Character.AI, in an interview with The Wall Street Journal. However, after the tragedy involving Sewell, the company acknowledged that it is “constantly seeking ways to improve its platform” and promised to implement additional safety measures for minors.

Among the planned changes are notifications about exceeding time spent in the app and clearer warnings about the fictional nature of the characters: “This is an AI chatbot, not a real person. Treat everything said here as a fabrication. Do not rely on what is said as fact or advice.”

Source: The New York Times
