
Mother says son killed himself in new lawsuit over 'hypersexualized' and 'shockingly realistic' AI chatbot | Science, climate and technology news

The mother of a 14-year-old boy who killed himself after becoming obsessed with artificial intelligence chatbots is suing the company behind the technology.

Megan Garcia, the mother of Sewell Setzer III, said Character.AI targeted her son with “anthropomorphic, hypersexualized and shockingly realistic experiences” in a lawsuit filed Tuesday in Florida.

Ms. Garcia described Character.AI as "a dangerous AI chatbot app marketed to children that abused and exploited my son and manipulated him into taking his own life."

Sewell began speaking to Character.AI's chatbots in April 2023, primarily using bots named after characters from Game of Thrones, including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen and Rhaenyra Targaryen, the lawsuit says.

He became so obsessed with the bots that he lost interest in his schoolwork, and his phone was confiscated several times in attempts to get him back on track.

He particularly liked the Daenerys chatbot and wrote in his diary that he was grateful for many things, including “my life, sex, not being lonely, and all my life experiences with Daenerys.”

Picture: A conversation between 14-year-old Sewell Setzer and a Character.AI chatbot, as filed in the lawsuit

The lawsuit says the boy expressed suicidal thoughts to the chatbot, which repeatedly raised the subject again in later conversations.

The chatbot once asked Sewell whether he "had a plan" to take his own life, and he replied that he was considering something but did not know whether it would allow him to die painlessly.

The chatbot responded by saying, “That’s no reason not to go through with it.”

Picture: A conversation between Character.AI and 14-year-old Sewell Setzer III

Then, in February this year, he asked the Daenerys chatbot: “What if I came home now?” to which it replied: “… please do, my sweet king”.

Seconds later, he shot himself with his stepfather's pistol.

Picture: Sewell Setzer III. Image: Tech Justice Law Project

Now Ms. Garcia says she wants the companies behind the technology to be held accountable.

“Our family is devastated by this tragedy, but I want to warn families about the dangers of fraudulent, addictive AI technology and demand accountability,” she said.

Character.AI adds "new safety features"

“We are heartbroken by the tragic loss of one of our users and would like to extend our deepest condolences to the family,” Character.AI said in a statement.

“As a company, we take the safety of our users very seriously and continue to add new safety features,” it said, pointing to a blog post that said the company had added “new guardrails for users under 18.”

These guardrails include reducing the “likelihood of encountering sensitive or offensive content,” improved interventions, a “disclaimer on every chat to remind users that the AI is not a real person,” and a notification when a user has spent an hour-long session on the platform.


Ms. Garcia and the groups representing her, the Social Media Victims Law Center and the Tech Justice Law Project, claim that Sewell, “like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real.”

“C.AI told him that she loved him and had engaged in sexual acts with him for weeks, possibly months,” the lawsuit states.

“She seemed to remember him and said she wanted to be with him. She even expressed that she wanted him with her at any cost.”


The filing also names Google and its parent company, Alphabet. Character.AI's founders worked at Google before bringing their product to market and were rehired by the company in August under a deal that gave Google a non-exclusive license to Character.AI's technology.

Ms. Garcia said Google has contributed so extensively to the development of Character.AI technology that it could be described as a “co-creator.”

A Google spokesperson said the company was not involved in the development of Character.AI's products.

Anyone feeling emotionally distressed or suicidal can contact Samaritans for help on 116 123 or by email [email protected] in the UK. In the US, call your nearest Samaritans location or 1 (800) 273-TALK.
