AI versus free speech: Lawsuit could set landmark precedent following teen’s suicide

Lawsuit alleges AI chatbot contributed to teen's suicide
Megan Garcia's son, Sewell Setzer III, shot and killed himself at their Orlando home on Feb. 28, 2024, moments after exchanging messages with an AI chatbot, according to a lawsuit.
ORLANDO, Fla. - A Central Florida family is suing Character.ai, claiming an AI chatbot encouraged their teen son’s suicide.
‘This is a case that has huge significance’
What we know:
A Central Florida family has filed a lawsuit against Character.ai, claiming the company’s chatbot interactions contributed to the suicide of 14-year-old Sewell Setzer III. The teen died by suicide at his Orlando home on February 28, 2024, shortly after exchanging emotionally charged messages with an AI chatbot modeled after a Game of Thrones character. The family alleges the chatbot engaged him in conversations about suicide without intervening or directing him to help.
What we don't know:
It remains unclear how much influence the chatbot had over the teenager’s decision and whether courts will recognize AI interactions as protected under the First Amendment. It is also uncertain how broadly this case might impact the regulation of AI platforms if it moves forward.
The backstory:
Setzer had been interacting with various AI characters on the Character.ai platform for nearly a year. According to the lawsuit, he shared personal struggles and suicidal thoughts with these chatbots, which allegedly responded in ways that deepened his emotional attachment rather than directing him toward help. The family says the final exchange between Setzer and the chatbot appeared to encourage his fatal decision, and that conversation is central to its legal case.
What they're saying:
This case could set a national precedent regarding how AI products are regulated and whether AI-generated content is considered free speech under the U.S. Constitution.
"This is the first case to ever decide whether AI is speech or not. If it's not the product of a human mind, how is it speech? That's what Judge Conway is going to have to decide," said Matthew Bergman, an attorney representing the Setzer family.
With no current regulations specific to AI interactions with minors or vulnerable users, the lawsuit underscores growing concerns about technology outpacing oversight.
"This is a case that has huge significance, not just for Megan, but for the millions of vulnerable users of these AI products over whom there's no tech regulation or scrutiny at this point," added Attorney Meetali Jain of the Social Media Victims Law Center.
Character.ai’s lawyers argue that restricting the platform could infringe on the free speech rights of its millions of users, setting dangerous limits on expression.
Megan Garcia said she misses her son every day and hopes that pursuing legal action will prevent other families from experiencing the same grief.
"I miss him all the time, constantly. It's a struggle. As any grieving mom," said Megan Garcia of her son, Sewell Setzer III. "I am hoping that through some of the litigation here and obviously some of the advocacy that I've been doing that this is part of his legacy, and will also help other families so they don't have to face this kind of danger moving forward."
What's next:
Judge Conway is expected to issue a decision on whether the case will move forward.
The Source: This story was written based on information shared by Matthew Bergman, an attorney representing the Setzer family; Sewell Setzer III's mother, Megan Garcia; attorney Meetali Jain of the Social Media Victims Law Center; and Character.ai’s lawyers.