General Discussion
Her teenage son killed himself after talking to a chatbot. Now she's suing.
"What if I told you I could come home right now?" Setzer replied to the chatbot, named for a Game of Thrones heroine who later becomes the villain. The chatbot sent an encouraging response: "please do, my sweet king."
Seconds later, Setzer shot himself with his stepfather's gun.
Megan Garcia, Setzer's mother, said Character.AI, the start-up behind the personalized chatbot, is responsible for his suicide. Garcia alleged that Character.AI recklessly developed its chatbots without proper guardrails or precautions, instead hooking vulnerable children like Setzer with an addictive product that blurred the lines between reality and fiction, and whose interactions grew to contain abusive and sexual content, according to a 93-page wrongful-death lawsuit filed this week in a U.S. District Court in Orlando.
https://www.washingtonpost.com/nation/2024/10/24/character-ai-lawsuit-suicide/
https://archive.ph/uk0zL Free access
rurallib
(64,688 posts)
Who left a gun where a 14-year-old could access it?
marble falls
(71,919 posts)
John1956PA
(4,964 posts)
In that fateful final chat, he was talking about suicide, and the chatbot's response was a routine reply to a "come home" remark. It is all so horribly sad.
TeslaNova
(317 posts)
Who had a gun that was unsecured.
Jmb 4 Harris-Walz
(1,117 posts)
iemanja
(57,757 posts)
He obviously got his hands on it easily.
atreides1
(16,799 posts)
If he was determined to "go home," he would have found a way to do it!
Fullduplexxx
(8,626 posts)
RAB910
(4,030 posts)
Kids today are shit... Jesus fucking Christ, how in the hell did children become so soft that they could kill themselves over a fucking computer program? When I was young, things were a hell of a lot tougher and more dangerous, and you didn't see my generation offing themselves.
I appreciate that mental health and deaths by suicide shouldn't be taken lightly, nor are they topics for simplistic interpretation, but sometimes I just need to vent. On the serious side, it sure sounds like there were more issues at play than just the AI chatbot.