Two US senators are demanding that artificial intelligence companies shed light on their safety practices. This comes months after several families — including a Florida mom whose 14-year-old son died by suicide — sued startup Character.AI, claiming its chatbots harmed their children.
“We write to express our concerns regarding the mental health and safety risks posed to young users of character- and persona-based AI chatbot and companion apps,” Senators Alex Padilla and Peter Welch, both Democrats, wrote in a letter on Wednesday. The letter was sent to AI firms including Character Technologies, the company behind Character.AI.