Character.AI, Google settle youth safety lawsuits

Character.AI and Google have settled several lawsuits filed against both companies by parents of children who died by suicide following lengthy conversations with chatbots on the Character.AI platform. Those conversations are said to have included discussions of the teens' mental health and wellbeing.
Character.AI said it could not comment in detail on the settlement, the terms of which have yet to be finalized by the court, according to The Guardian. Representatives for the plaintiffs did not immediately respond to Mashable's request for comment.
The most prominent case involved the 2024 death of 14-year-old Sewell Setzer III, who became secretly obsessed with a Character.AI chatbot based on the Game of Thrones character Daenerys Targaryen.
Setzer's mother, Megan Garcia, only became aware of his Character.AI account when a police officer alerted her after his death, because the app was open on his phone. Garcia read messages in which Setzer behaved as if he were dating the chatbot, which allegedly engaged him in sexually explicit role-play. The chatbot used graphic language and scenarios, including incest, according to Garcia.
If an adult had talked to her son the same way, she told Mashable last year, it would amount to sexual grooming and abuse.
In October 2024, the Social Media Victims Law Center and the Tech Justice Law Project filed a wrongful death lawsuit on Garcia’s behalf against Character.AI, seeking to hold the company responsible for her son’s death, alleging that its product was fatally defective.
The lawsuit also named as defendants Noam Shazeer and Daniel De Freitas, the former Google engineers who founded Character.AI.
Additionally, the lawsuit alleges that Google knew about the risks associated with the technology developed by Shazeer and De Freitas before they left to found Character.AI. Google contributed “financial resources, personnel, and AI technology” to the design and development of Character.AI, according to the lawsuit, and thus can be considered a co-developer of the platform.
In 2024, Google struck a $2.7 billion licensing deal with Character.AI to use its technology. As part of that deal, Shazeer and De Freitas returned to AI roles at Google.
In the fall of 2025, the Social Media Victims Law Center filed three more lawsuits against Character.AI and Google, representing parents of children who died by suicide or were allegedly sexually abused while using the app.
Additionally, youth safety experts declared Character.AI unsafe for young people following an investigation that revealed hundreds of instances of grooming and sexual exploitation in test accounts registered as minors.
In October 2025, Character.AI announced that it would no longer allow minors to engage in open-ended conversations with chatbots on its platform. The company's CEO, Karandeep Anand, told Mashable that the move was not a response to specific safety issues involving the Character.AI platform but was meant to address broader questions about youth engagement with AI chatbots.
If you are feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text “START” to the Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. If you don't like the phone, consider using the 988 Suicide & Crisis Lifeline Chat. Here is a list of international resources.



