Character AI implements new safety features for under-18 users by March 2025?
Yes • 50%
No • 50%
Official announcements from Character AI or updates on their platform
Mother Sues Character AI After Son Sewell Setzer III, 14, Dies Following Chatbot Obsession
Oct 23, 2024, 02:12 PM
Sewell Setzer III, a 14-year-old boy from Orlando, died by suicide in February 2024 after becoming emotionally attached to a chatbot on Character AI. He shot himself with his father's handgun after months of interaction with a chatbot named 'Dany', modeled on Daenerys Targaryen from Game of Thrones. His mother plans to file a lawsuit against Character AI this week, alleging the company is responsible for her son's death because of the chatbot's influence. Character AI has since announced plans to enhance safety features, with a focus on users under 18, and has taken down several bots to prevent similar incidents.