
iemanja

(57,757 posts)
Fri Oct 25, 2024, 01:06 PM Oct 2024

Her teenage son killed himself after talking to a chatbot. Now she's suing.

When 14-year-old Sewell Setzer III died in his Orlando home while his brothers and parents were inside, his last words were not to any of them, but to an artificial intelligence chatbot that told him to “come home to me as soon as possible.”

“What if I told you I could come home right now?” Setzer replied to the chatbot named for a “Game of Thrones” heroine who later becomes the villain. The chatbot sent an encouraging response: “ … please do my sweet king.”

Seconds later, Setzer shot himself with his stepfather’s gun.

Megan Garcia, Setzer’s mother, said Character.AI — the start-up behind the personalized chatbot — is responsible for his suicide. Garcia alleged that Character.AI recklessly developed its chatbots without proper guardrails or precautions, instead hooking vulnerable children like Setzer with an addictive product that blurred the lines between reality and fiction, and whose interactions grew to contain “abusive and sexual interactions,” according to a 93-page wrongful-death lawsuit filed this week in a U.S. District Court in Orlando.




https://www.washingtonpost.com/nation/2024/10/24/character-ai-lawsuit-suicide/

https://archive.ph/uk0zL Free access
Her teenage son killed himself after talking to a chatbot. Now she's suing. (Original Post) iemanja Oct 2024 OP
Seems to me there should be some liability for the person rurallib Oct 2024 #1
And that, too. Stopping a suicide is most often just making the victim take more time to accomplish it. marble falls Oct 2024 #5
I just can't even imagine the sorrow mixed with anger. I'd need serious sedation to even be able to speak. marble falls Oct 2024 #2
The chat bot previously admonished him against thoughts of suicide. John1956PA Oct 2024 #3
Yes, blame AI, not the step-dad TeslaNova Oct 2024 #4
The article makes reference to the gun's storage as follows: Jmb 4 Harris-Walz Oct 2024 #6
FL law means nothing iemanja Oct 2024 #7
Uh huh atreides1 Oct 2024 #8
Blame is better to give than recv Fullduplexxx Oct 2024 #9
Doing my best old man impression RAB910 Oct 2024 #10

rurallib

(64,688 posts)
1. Seems to me there should be some liability for the person
Fri Oct 25, 2024, 01:11 PM Oct 2024

who left a gun such that a 14-year-old could access it.

marble falls

(71,919 posts)
5. And that, too. Stopping a suicide is most often just making the victim take more time to accomplish it.
Fri Oct 25, 2024, 01:14 PM Oct 2024

marble falls

(71,919 posts)
2. I just can't even imagine the sorrow mixed with anger. I'd need serious sedation to even be able to speak.
Fri Oct 25, 2024, 01:13 PM Oct 2024

John1956PA

(4,964 posts)
3. The chat bot previously admonished him against thoughts of suicide.
Fri Oct 25, 2024, 01:13 PM Oct 2024

In that fateful final chat, he was talking about suicide, and the chat bot's response was a routine reply to a "come home" remark. It is all so horribly sad.

Jmb 4 Harris-Walz

(1,117 posts)
6. The article makes reference to the gun's storage as follows:
Fri Oct 25, 2024, 01:24 PM Oct 2024
Shortly before his death, Setzer went looking for his phone, which his mother had confiscated and hidden, and instead found his stepfather’s gun. (Police later said the gun had been stored in compliance with Florida laws, according to the lawsuit.)

RAB910

(4,030 posts)
10. Doing my best old man impression
Fri Oct 25, 2024, 01:47 PM Oct 2024

kids today are shit... Jesus fucking Christ, how in the hell did children become so soft that they could kill themselves over a fucking computer program? When I was young, things were a hell of a lot tougher and more dangerous, and you didn't see my generation offing themselves.

I appreciate that mental health and deaths by suicide shouldn't be taken lightly, nor are they topics for simplistic interpretation, but sometimes I just need to vent. On the serious side, it sure sounds like there were more issues at play than just the AI chatbot.
