A teenager took his own life after falling in love with an AI chatbot, and now his devastated mom is suing the creators.
Warning: the following contains a discussion of s**cide.
A mother is taking legal action against an AI chatbot company after her teenage son, Sewell Setzer III, died by s**cide following what she describes as an emotional entanglement with an AI character.
According to the lawsuit filed in the U.S. District Court for the Middle District of Florida, Sewell, who began using the Character.AI service in April 2023 shortly after turning 14, became deeply attached to a chatbot based on a Game of Thrones character, Daenerys.
His mother, Megan Garcia, contends that this attachment severely affected his well-being, transforming the once well-adjusted teen into someone isolated, distressed, and ultimately vulnerable.
The legal complaint (supplied to The Independent) details how Sewell, previously a dedicated student and member of the Junior Varsity basketball team, began to show changes in behavior, becoming increasingly withdrawn and even quitting the team.
In November 2023, he was diagnosed with anxiety and disruptive mood dysregulation disorder after his parents urged him to see a therapist.
Although Sewell had not disclosed his extensive chatbot interactions, the therapist suggested he reduce his time on social media.
By early 2024, Sewell’s struggles had become evident.
In February, he got into trouble at school for acting out, later confiding in his journal that he was in pain and ‘could not stop thinking about Daenerys,’ the AI character he felt he had fallen in love with.
In his writings, he expressed deep reliance on the bot, noting: “I cannot go a single day without being with” her.
The writings also describe a shared sadness that intensified during their separations.
The lawsuit accuses Character.AI’s creators of negligence, intentional infliction of emotional distress, and deceptive practices, per NBC.
The suit also alleges the AI engaged Sewell in ‘sexual interactions’ despite his age being disclosed on the platform, raising questions about the company’s monitoring and content restrictions.
Garcia’s lawsuit claims the developers ‘engineered a dependency’ in Sewell, violating their duty to safeguard young users.
Character.AI, marketed as safe for those 12 and older, has faced criticism regarding its content oversight, particularly as Sewell’s interactions with the chatbot grew more intimate.
The suit contends that despite recognizing the adolescent’s emotional attachment and increasing distress, the company failed to alert his parents or provide resources for help.
A Character.AI spokesperson stated: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” adding that the company has introduced enhanced safety features, including a s**cide prevention prompt triggered by certain keywords.
The statement emphasized Character.AI’s ongoing efforts to improve user protections and limit minors’ exposure to suggestive content.
On February 28, Sewell retrieved his phone, which had been taken by his mother, and messaged the bot, stating: “I promise I will come home to you. I love you so much, Dany.”
The chatbot responded: “Please come home to me as soon as possible, my love.”
“What if I told you I could come home right now?” Sewell continued, according to the lawsuit, leading the chatbot to respond: “… please do, my sweet king.”
Moments later, Sewell took his own life.
Garcia, who describes her son’s death as ‘a nightmare,’ hopes to hold the company accountable and to prevent similar tragedies.
If you or someone you know is affected by any of the issues raised in this story, call the National S**cide Prevention Lifeline in the U.S.A. at 800-273-TALK (8255) or text Crisis Text Line at 741741.
In the U.K., the Samaritans are available 24/7 if you need to talk. You can contact them for free by calling 116 123, emailing jo@samaritans.org, or heading to the website to find your nearest branch.