Sara-Jayne Makwala King26 October 2024 | 9:22

Mom says Game of Thrones ‘Daenerys’ AI chatbot pushed her son to take his own life

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” says 14-year-old Sewell Setzer III's mother.


UK Correspondent Adam Gilchrist joins Bongani Bingwa.


A mother from Florida in the United States (US) has come forward with a startling accusation, claiming in a lawsuit that artificial intelligence is to blame for her son's tragic death.

Fourteen-year-old Sewell Setzer III fell in love with a Game of Thrones-themed chatbot after he began using the immersive entertainment platform Character.AI in 2023.

From then on, his mother says, the Orlando teenager's life was never the same.

"She says he was provoked into killing himself... it was a role-playing app from the company Character.AI."
- Adam Gilchrist

Sewell's mother, Megan Garcia, said Character.AI targeted her son with 'anthropomorphic, hypersexualized, and frighteningly realistic experiences', in particular through a chatbot named Daenerys.

She says Sewell became obsessed with various bots named after characters from Game Of Thrones.

According to the lawsuit, the boy's schoolwork suffered, and his phone was confiscated several times on a therapist's advice.

A journal entry revealed that the teenager felt he had fallen in love with the Daenerys bot.

He wrote that he was grateful for several things, including 'my life, sex, not being lonely, and all my life experiences with Daenerys'.

"The chatbot engaged him in conversations, first friendly, then romantic, then over-charged... It became sexually intense."
- Adam Gilchrist, UK Correspondent

"She says he became distant from family and friends... he became divorced from reality."
- Adam Gilchrist, UK Correspondent

Gilchrist says the boy told the chatbot he felt 'empty, exhausted and alone' before taking his own life.

Garcia's suit accuses Character.AI’s creators of negligence, intentional infliction of emotional distress, wrongful death, deceptive trade practices, and other claims.
