In February 2024, 14-year-old Sewell Setzer III from Orlando, Florida, took his own life.
The lawsuit states that Sewell, who was diagnosed with mild Asperger’s syndrome as a child, spent hours alone in his room talking to a Character.AI chatbot modeled on the Game of Thrones character Daenerys Targaryen, whom he called ‘Dany’. He also texted the bot from his phone when he was away from home.
In one of his journal entries, Sewell wrote: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”
In a conversation with the chatbot, Sewell also opened up about his thoughts of ending his life.
The chatbot responded by saying: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?”
Sewell continued to talk about wanting to be ‘free’ from both the world and himself. Although the chatbot urged him not to ‘talk like that’ and not to ‘hurt [himself] or leave’, even stating that it would ‘die’ if it ‘lost’ him, Sewell replied: “I smile Then maybe we can die together and be free together.”
In a press release, Sewell’s mother, Megan Garcia, stated: “A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life.”
Garcia also told CBS Mornings: “I didn’t know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment.”
The lawsuit adds: “Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot…was not real.”
Garcia resolved to raise awareness, stating: “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”
In a tweet, Character.AI said: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”
In a release shared on its website on October 22, the company explained that it had introduced ‘new guardrails for users under 18’, including changes to its ‘models’ aimed at ‘reducing the likelihood of sensitive or suggestive content’, along with ‘enhanced detection, response, and intervention for inputs that violate our Terms or Community Guidelines’.
The lawsuit also names Google as a defendant. However, Google told The Guardian that it was not involved in the development of Character.AI, even though the company was founded by two former Google engineers, and clarified that its role was limited to a licensing agreement with Character.AI.
If immediate help is needed, contact the National Suicide Prevention Lifeline at 1-800-273-TALK (8255). The Lifeline is free, confidential, and available 24 hours a day, seven days a week.