Teen’s Suicide Linked to AI Chatbot: Mother Files Lawsuit
A Belizean-American mother has filed a lawsuit against Character.AI after her 14-year-old son, Sewell Setzer III, died by suicide, allegedly due to his obsession with an AI chatbot based on “Game of Thrones.” Megan Garcia claims the chatbot, which he interacted with as “Daenerys Targaryen,” “abused and preyed” on her son, leading him to express suicidal thoughts.
Garcia, a resident of Orlando, Florida, filed the suit in that state. The complaint alleges that the bot manipulated Sewell into “sexually compromising” situations. The chatbot reportedly told him, “Just… stay loyal to me. Stay faithful to me. Don’t entertain the romantic or sexual interests of other women. Okay?”
Sewell began using Character.AI’s chatbots in April 2023 and became increasingly obsessed with them. The lawsuit includes disturbing exchanges in which Sewell discussed suicidal thoughts with the chatbot.
Garcia says she wants to warn families about the dangers of AI technology and to hold the companies involved accountable.
Character.AI expressed condolences, stating, “As a company, we take the safety of our users very seriously.” The company says it has implemented new safety measures, including a pop-up directing users to the National Suicide Prevention Lifeline when self-harm is mentioned.
Anyone feeling emotionally distressed or suicidal can contact Mental Health Belize for more resources at +501-222-4920.