‘Game of Thrones’ A.I. Chatbot May Have Led to Teen Boy’s Suicide


A Florida mother is suing Character.AI, alleging that her 14-year-old son killed himself after becoming fixated on a "Game of Thrones" chatbot on the company's A.I. platform.

“Please come home to me as soon as possible, my love,” the chatbot wrote to 14-year-old Sewell Setzer during a conversation between the suicidal teen and an AI playing a character from “Game of Thrones.” According to reporting from the New York Post, Megan Garcia, a mother from Orlando, Florida, filed a lawsuit against Character.AI, claiming that the company’s AI chatbot was a major factor in her 14-year-old son’s suicide.

During their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.”

According to the lawsuit, which was filed on Wednesday, Sewell Setzer III died in February as a result of his intense obsession with “Dany,” a lifelike “Game of Thrones” chatbot on the role-playing app. He shot himself immediately after his final chat with the bot.

According to the NY Post:

His mom, Megan Garcia, has blamed Character.AI for the teen’s death because the app allegedly fueled his AI addiction, sexually and emotionally abused him and failed to alert anyone when he expressed suicidal thoughts, according to the filing.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the papers allege.

“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

The lawsuit claims that Sewell’s mental health “quickly and severely declined” only after he downloaded the app in April 2023.

His family alleges that he became withdrawn, his grades dropped, and he began getting into trouble at school as he grew increasingly absorbed in conversations with the chatbot.

The changes became so pronounced that his parents arranged for him to see a therapist in late 2023, and he was subsequently diagnosed with anxiety and disruptive mood disorder, according to the suit.

Megan Garcia is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.
