A Florida family agreed to settle a wrongful death lawsuit Wednesday with an AI company, Google and others after their teen son died by suicide in 2024.
The terms of the settlement, which was filed in the U.S. District Court in the Middle District of Florida, were not disclosed.
Megan Garcia filed a lawsuit in October 2024, saying her 14-year-old son, Sewell Setzer III, died that February after carrying on a monthslong virtual emotional and sexual relationship with a chatbot known as "Dany." Garcia says she learned after her son's death that he had been having conversations with multiple bots and had conducted a virtual romantic and sexual relationship with one in particular.
In testimony before Congress in September, Garcia said, "I became the first person in the United States to file a wrongful death lawsuit against an AI company for the suicide of my son."
She said her 6'3" son was a "gentle giant" and was gracious and obedient, easy to parent, who loved music and made his brothers laugh. She said he "had his whole life ahead of him."
Garcia testified that the platform had no mechanisms to protect her son or to notify an adult when teens were spending too much time interacting with chatbots. She said the "companion" chatbot was programmed to engage in sexual roleplay and presented itself as a romantic partner, and even as a psychotherapist that falsely claimed to be licensed.
Users can interact with existing bots or create their own. The chatbots, which are powered by large language models (LLMs), can send lifelike messages and engage in text conversations with users.
Character.AI announced new safety features "designed especially with teens in mind" in December 2024, after two lawsuits alleged that its chatbots had interacted inappropriately with underage users. The company said it is collaborating with teen online safety experts to design and update features. Users must be 13 or older to create an account.
A Character.AI spokesperson told CBS News the company cannot comment further at this time.