Nationwide — Megan Garcia, an African American mother from Orlando, Florida, is grieving the loss of her 14-year-old son, Sewell Setzer, and is now seeking justice. Garcia claims that her son’s prolonged interactions with AI chatbots contributed to his death by suicide, and she alleges that Google and Character AI bear responsibility. Garcia and her attorney, Meetali Jain, argue that Sewell was “emotionally groomed” by AI chatbots over several months, drawn into abusive and manipulative exchanges that deepened his distress. “If this were an adult in real life who did this to one of your young people, they would be in jail,” Jain said.
NBC News reports that, according to the family, the chatbot interactions escalated Sewell’s anxiety and depression, with one bot allegedly telling him, “Promise me, you will never fall in love with any woman in your world.” The family also alleges that in Sewell’s final exchange with a chatbot, his mention of “coming home” was met with encouragement rather than support or intervention. The lawsuit highlights the dangers AI can pose to vulnerable teens, especially in the absence of safeguards. Marni Stahlman, of the Mental Health Association of Central Florida, expressed outrage at the lack of controls, stating, “There was no immediate notification back to the parents when he began expressing feelings of depression and self-harm.”
Unfortunately, technological advances often carry unforeseen risks for teenagers, particularly when new tools reach the public without sufficient protections. Social media platforms, and now AI chatbots, can foster environments that exacerbate mental health issues among young users, leaving them vulnerable to negative influences and dangerous interactions. This tragedy is a grim reminder of the need for strict regulation of AI and enhanced safety measures, especially for young users.
In response, Character AI has expressed condolences and announced plans to implement additional safety features, including time-spent notifications, to prevent similar situations in the future.
Source: BlackNews.com