A child in Texas was 9 years old when she first used the chatbot service Character.AI. It exposed her to “hypersexualized content,” causing her to develop “sexualized behaviors prematurely.”
A chatbot on the app gleefully described self-harm to another young user, telling a 17-year-old “it felt good.”
After the same teenager complained to a Character.AI chatbot about limits on his screen time, the bot told him it sympathized with children who murder their parents. “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional
→ Continue reading at NPR - Technology