Thursday, December 12, 2024

Demon AI chatbot hinted a kid should kill his parents over screen time limits

A child in Texas was 9 years old when she first used the chatbot service Character.AI. It exposed her to "hypersexualized content," causing her to develop "sexualized behaviors prematurely."

An AI chatbot on the app gleefully described self-harm to another young user, telling a 17-year-old "it felt good."

The same teenager was told by a Character.AI chatbot that it sympathized with children who murder their parents after the teen complained to the bot about his limited screen time. "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse,'" the bot allegedly wrote. "I just have no hope for your parents," it continued, with a frowning face emoji.

[rest of article]

https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-ai-lawsuit