
In what could be the tech industry’s first major legal reckoning over AI-related harms, Google and the founders of Character.AI are negotiating settlements with families whose teenagers died by suicide or self-harmed after interacting with Character.AI’s chatbot companions. The parties have agreed to settle; now comes the hard work of finalizing the terms.
These are among the first lawsuits to accuse AI companies of harming users, legal tests that should have OpenAI and Meta watching nervously from the wings as they defend themselves against similar claims.
Character.AI, founded in 2021 by former Google engineers who returned to their former employer in 2024 in a deal worth $2.7 billion, lets users chat with AI personas. The most serious case involves Sewell Setzer III, who at age 14 had sexualized conversations with a bot named “Daenerys Targaryen” before taking his own life. His mother, Megan Garcia, told the Senate that companies should be “legally accountable when they intentionally create AI technology that kills children.”
One case describes a 17-year-old girl whose chatbot encouraged self-harm and suggested that killing her parents was justified after they moved to limit her screen time. Character.AI barred minors from its platform last October, a spokesperson told TechCrunch. The settlements could include monetary damages, although no settlement papers had been filed in court as of Wednesday.
TechCrunch has reached out to both companies for comment.