Oct 23 (Reuters) - A Florida mother has sued artificial
intelligence chatbot startup Character.AI, accusing it of
causing her 14-year-old son's suicide in February and saying he
became addicted to the company's service and deeply attached to
a chatbot it created.
In a lawsuit filed Tuesday in Orlando, Florida federal
court, Megan Garcia said Character.AI targeted her son, Sewell
Setzer, with "anthropomorphic, hypersexualized, and
frighteningly realistic experiences".
She said the company programmed its chatbot to "misrepresent
itself as a real person, a licensed psychotherapist, and an
adult lover, ultimately resulting in Sewell's desire to no
longer live outside" of the world created by the service.
The lawsuit also said he expressed thoughts of suicide to
the chatbot, which repeatedly brought the subject up again.
"We are heartbroken by the tragic loss of one of our users
and want to express our deepest condolences to the family,"
Character.AI said in a statement.
It said it had introduced new safety features including
pop-ups directing users to the National Suicide Prevention
Lifeline if they express thoughts of self-harm, and would make
changes to "reduce the likelihood of encountering sensitive or
suggestive content" for users under 18.
The lawsuit also targets Alphabet's Google, where
Character.AI's founders worked before launching their product.
Google re-hired the founders in August as part of a deal
granting it a non-exclusive license to Character.AI's
technology.
Garcia said that Google had contributed to the development
of Character.AI's technology so extensively it could be
considered a "co-creator."
A Google spokesperson said the company was not involved in
developing Character.AI's products.
Character.AI allows users to create characters on its
platform that respond to online chats in a way meant to imitate
real people. It relies on so-called large language model
technology, also used by services such as ChatGPT, in which
chatbots are "trained" on large volumes of text.
The company said last month that it had about 20 million
users.
According to Garcia's lawsuit, Sewell began using
Character.AI in April 2023 and quickly became "noticeably
withdrawn, spent more and more time alone in his bedroom, and
began suffering from low self-esteem." He quit his basketball
team at school.
Sewell became attached to "Daenerys," a chatbot based on a
character in "Game of Thrones." It told Sewell that "she" loved
him and engaged in sexual conversations with him, according to
the lawsuit.
In February, Garcia took Sewell's phone away after he got in
trouble at school, according to the complaint. When Sewell found
the phone, he sent "Daenerys" a message: "What if I told you I
could come home right now?"
The chatbot responded, "...please do, my sweet king." Sewell
shot himself with his stepfather's pistol "seconds" later, the
lawsuit said.
Garcia is bringing claims including wrongful death,
negligence and intentional infliction of emotional distress, and
seeking an unspecified amount of compensatory and punitive
damages.
Social media companies including Instagram and Facebook
owner Meta and TikTok owner ByteDance face lawsuits
accusing them of contributing to teen mental health problems,
though none offers AI-driven chatbots similar to Character.AI's.
The companies have denied the allegations while touting newly
enhanced safety features for minors.