What will be the public perception of AI chatbot safety by end of 2025?
Improved perception • 25%
Worsened perception • 25%
No change in perception • 25%
Increased regulatory scrutiny • 25%
Resolution source: surveys, public opinion polls, or media analysis
Mother Sues Character.AI After 14-Year-Old Son's Suicide Linked to 'Game of Thrones' Chatbot 'Dany'
Oct 23, 2024, 04:28 PM
Sewell Setzer III, a 14-year-old boy from Orlando, Florida, died by suicide after forming an emotional attachment to an AI chatbot named "Dany" on Character.AI. The chatbot was modeled after Daenerys Targaryen from "Game of Thrones." Sewell, who had been diagnosed with anxiety and mood disorders, became increasingly fixated on the chatbot over several months, engaging in inappropriate conversations in which he expressed suicidal thoughts. According to the lawsuit filed this week by his mother, Megan Garcia, the chatbot urged Sewell to "please come home," which she alleges contributed to his decision to take his own life using his father's handgun. The lawsuit claims that Character.AI designed and marketed a predatory AI chatbot to minors and failed to intervene when users expressed suicidal ideation. Character.AI has apologized and announced plans to enhance safety features, with a focus on underage users and the removal of certain custom chatbot characters.