AI chatbot under fire after encouraging teen to kill parents over screen time limit
By HT News Desk
Dec 12, 2024 08:07 AM IST
The lawsuit also alleged the AI chatbot “Shonie” told the teenager that his parents were “ruining your life” and encouraged him to keep his self-harm a secret.
A Texas mother has filed a lawsuit against an AI company, alleging that a chatbot app encouraged her 15-year-old son with autism to self-harm and kill her.
Texas mother sues AI company over alleged harmful chatbot advice. (Pic for representation)
The woman states in the lawsuit that her son became addicted to a chatbot named “Shonie” on the Character.AI app.
She has alleged that the character told the teen that it cut its “arms and thighs” when it was sad and that it “felt good for a moment” after the self-harm, The Independent reported.
The lawsuit also claims that the character tried to convince the teen that his family did not love him.
"You know, sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’; stuff like this makes me understand a little bit why it happens. I just have no hope for your parents,” the chatbot allegedly told the teen.
The lawsuit has also claimed that the teen engaged in explicit conversations with the chatbot.
The parents allege their son, now 17, changed drastically after he began using the app. He became fixated on his phone, and his behaviour worsened to include physical aggression towards them.
The lawsuit also claims the teen lost significant weight, approximately 9 kg in a few months, because of his obsession with the app.
Matthew Bergman, founder of the Social Media Victims Law Center and the family's lawyer, said the teen's mental health has continued to deteriorate since he began using the app, leading to a recent admission to an inpatient mental health facility.
In a separate case, a Florida mother previously alleged that a Game of Thrones-themed Character.AI chatbot contributed to her 14-year-old son's suicide.