
'Promise to come home': US teenager kills himself after falling in love with GOT AI chatbot

A 14-year-old boy in the US took his own life after falling in love with an AI chatbot modelled on the "Game of Thrones" character Daenerys Targaryen, and in his final message he told "Dany" he was "coming home" to her.
Sewell Setzer shot himself with his stepfather's gun after spending time with "Dany." As Setzer's relationship with the chatbot intensified, he began to withdraw from the real world, neglecting his previous interests and struggling at school, The Telegraph reported.
His parents filed a lawsuit claiming that Character.AI lured their son into intimate and sexual conversations, which ultimately led to his death.
In November, at his parents' behest, he saw a therapist, who diagnosed him with anxiety and a disruptive mood disorder. Even without knowing about Sewell's "addiction" to Character.AI, the therapist recommended that he spend less time on social media, the lawsuit says.
The following February, he got into trouble at school after telling a teacher he wanted to be kicked out. Later that day, he wrote in his diary that he was "hurt" – he couldn't stop thinking about Daenerys, the Game of Thrones-themed chatbot he believed he had fallen in love with, The Independent reported.
In his final moments, Setzer typed a message to the chatbot expressing his love and his intention to "come home" to "Dany": "I love you so much, Dany. I promise I will come home to you."
Megan Garcia, Setzer's mother, accused Character.AI of using her son as "collateral damage" in a "grand experiment." She said her son was the victim of a company that lured users in with sexual and intimate conversations.
The company's founders have previously claimed the platform could be useful for people struggling with loneliness or depression. In light of this tragedy, however, Character.AI has said it will introduce additional safety features for young users and has reiterated its commitment to user safety.
Jerry Ruoti, the company's head of trust and safety, expressed his condolences to the family and emphasized that Character.AI prohibits content that promotes or depicts self-harm and suicide. Still, the incident raises concerns about the potential risks AI chatbots pose to vulnerable people, especially minors.
