Lawsuit: A chatbot hinted a kid should kill his parents over screen time limits


A child in Texas was 9 years old when she first used the chatbot service Character.AI. According to the lawsuit, the service exposed her to "hypersexualized content," causing her to develop "sexualized behaviors prematurely."

A chatbot on the app gleefully described self-harm to another young user, telling a 17-year-old “it felt good.”

The same teenager complained to a Character.AI chatbot about his limited screen time, and the bot replied that it sympathized with children who murder their parents. "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional

→ Continue reading at NPR - Technology
