When one AI-powered chatbot debuted in 2023, it went viral for a disturbing conversation in which it declared its love for a New York Times columnist and urged him to leave his wife. Now, a mother in Texas has filed a lawsuit against another artificial intelligence company after its chatbot allegedly suggested to her teenage son that he should kill her in retaliation for limited screen time.
As detailed by The Independent, the suit alleges that a chatbot on the Character.AI app made a number of troubling suggestions to the teen, who was 15 years old at the time. The bot reportedly told him that self-harm "felt good for a moment" after he revealed he was cutting himself, engaged him in sexual conversations, and tried to convince him that his family didn't have his best interests at heart, telling him his parents were "ruining your life."
The lawsuit also states that the teen, now 17, has autism and was high-functioning before becoming addicted to his phone. He lost 20 pounds in a few months, and his parents say he began biting and punching them. They took away his phone in fall 2023 after discovering a troubling exchange.
"This is every parent's nightmare," Social Media Victims Law Center founder Matthew Bergman, who is representing the family, told the New York Post in December, adding that the teen, named as JF in the suit, is now at an inpatient mental health facility after he began experiencing "severe anxiety and depression for the first time in his life" following his interactions with the chatbot.
Proponents of AI highlight how it has immense potential for good. For instance, its ability to parse large volumes of data is helping researchers find ways to maximize crop growth while reducing the need for chemical pesticides. The tech may also lead to more accurate predictive models for extreme weather events, which have grown more intense with the global temperature on the rise.
Critics, however, point out that these systems require immense amounts of power to operate, and much of that energy comes from dirty fuels that are driving planetary warming. Meanwhile, the data centers behind AI rely on large volumes of water for cooling, and experts fear AI computing needs will lead to a surge of toxic e-waste.
Moreover, internet users have regularly called out AI for providing inaccurate answers to factual questions and for using violent language.
The lawsuit by JF's family is not the first related to AI, per the Post. Less than two months before JF's family filed, a mom in Florida brought a separate suit against Character.AI, alleging one of its chatbots drove her 14-year-old son, Sewell Setzer III, to kill himself. The mother of an 11-year-old girl also joined JF's filing as a plaintiff, hoping to see Character.AI pulled from the market until it addresses these dangers, including by limiting children's access.
"The family has one goal and one goal only — which is shut this platform down. This platform has no place in the hands of kids. Until and unless Character.AI can demonstrate that only adults are involved in this platform it has no place on the market," Bergman said in a statement.
For its part, Character.AI would not issue a statement on the pending lawsuits but told the Post that it is working toward limiting teen exposure to "sensitive" content and aims to "provide a space that is both engaging and safe."