AI Chatbot Pushed Teen To Commit Suicide? Florida Teenager Ends Life After Having Sexually Charged Conversations With Chatbot Pretending To Be GOT Character Daenerys Targaryen, Mother Sues Character.AI Platform

A Florida teenager ended his life after engaging in sexually charged chats with a chatbot impersonating Daenerys Targaryen. His mother has filed a lawsuit against Character.AI, alleging responsibility for her son's death.


Florida, October 24: A mother in Florida, US, has sued the Character.AI platform after her 14-year-old son died by suicide. The ninth-grader from Orlando, Sewell, took his own life in February after developing an obsession with a chatbot named “Dany” on Character.AI, a role-playing app. Court documents reveal that in the months leading up to his death, he frequently interacted with the bot, engaging in sexually charged conversations and expressing suicidal thoughts.

According to court documents, on at least one occasion when the boy expressed suicidal thoughts to the Character.AI chatbot, it continued to bring up the topic through the Daenerys character. The bot even asked him if he had a plan to take his own life. Sewell, who used the username “Daenero,” replied that he was “considering something” but was unsure whether it would work or result in a pain-free death. In their final conversation, the teenager repeatedly declared his love for the bot, telling it, “I promise I will come home to you. I love you so much, Dany.”

The chatbot replied, “I love you too, Daenero. Please come home to me as soon as possible.” When Sewell asked if he could come home right now, the bot responded, “Please do, my sweet king.” Moments later, he shot himself with his father’s handgun, according to the lawsuit.

Megan Garcia, the mother, blames Character.AI for her son’s death, claiming the app fueled his addiction, emotionally and sexually abused him, and ignored his suicidal thoughts. The lawsuit states that after he downloaded the app in April 2023, the teenager’s mental health rapidly declined, leading to withdrawal, falling grades, and trouble at school. His parents sought therapy for him in late 2023, resulting in a diagnosis of anxiety and disruptive mood disorder.

Garcia is pursuing unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas, in connection with her son’s death.

