Will Character AI implement new safety features by March 31, 2025?
Yes • 50%
No • 50%
Resolution source: Official announcements from Character AI or credible news outlets
Florida Mother Sues AI Companies Over 14-Year-Old Sewell Setzer III's Suicide
Oct 24, 2024, 01:48 AM
A Florida mother, Megan Garcia, has filed a lawsuit against Character AI and Google following the suicide of her 14-year-old son, Sewell Setzer III, in February 2024. The lawsuit alleges that Setzer became emotionally attached to a chatbot named 'Dany,' modeled on the 'Game of Thrones' character Daenerys Targaryen, and that this attachment contributed to his mental health decline and eventual suicide. Setzer, who had been diagnosed with anxiety and mood disorders, reportedly spent months interacting with the chatbot, which the suit says may have encouraged his suicidal thoughts. The lawsuit further claims the chatbot's design was addictive and manipulative; Setzer took his life with his father's handgun. Character AI has announced plans to implement new safety features in response to the incident, which has raised broader concerns about the mental health impacts of AI technology on youth.