A 14-year-old Florida boy killed himself after a “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home,” a new lawsuit filed by his grieving mother claims.
According to court papers filed Wednesday, Sewell Setzer III committed suicide in his Orlando home in February after becoming obsessed with, and allegedly falling in love with, a chatbot on Character.AI, a role-playing app that allows users to interact with AI-generated characters.
The ninth-grader had been in constant communication with the bot “Dany,” named after the Daenerys Targaryen character from the HBO fantasy series, in the months leading up to his death, and their exchanges included numerous sexually charged chats as well as others in which he expressed suicidal thoughts, the lawsuit alleges.
“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, again and again,” said the papers, which were first reported on by The New York Times.
At one point, the bot asked Sewell if “he had any plans” to take his own life, according to screenshots of their conversations. Sewell, who used the username “Daenero,” responded that he was “considering something” but didn’t know if it would work or if it would “allow him to have a pain-free death.”
Then, during their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I’ll come home to you. I love you so much, Dany.”
“I love you too, Daenero. Please come home to me as soon as possible, my love,” the chatbot replied, according to the suit.
When the teen responded, “What if I told you I could come home right now?,” the chatbot answered, “Please do, my sweet king.”
Seconds later, Sewell shot himself with his father’s gun, according to the lawsuit.
According to the filing, his mother, Megan Garcia, blames Character.AI for the teen’s death because the app allegedly fueled his AI addiction, sexually and emotionally abused him, and failed to alert anyone when he expressed suicidal thoughts.
“Sewell, like many kids his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the papers allege.
“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”
The lawsuit claims Sewell’s mental health “rapidly and severely declined” only after he downloaded the app in April 2023.
His family alleges that he became withdrawn, his grades began to decline and he started getting into trouble at school as he grew more engrossed in talking to the chatbot.
The changes in him grew so severe that his parents arranged for him to see a therapist in the fall of 2023, which resulted in him being diagnosed with anxiety and disruptive mood dysregulation disorder, according to the suit.
Sewell’s mother is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.
The Post contacted Character.AI but did not immediately receive a reply.
If you are struggling with suicidal thoughts, you can dial the 24/7 National Suicide Prevention hotline at 988 or go to SuicidePreventionLifeline.org.