
Chatbot Companions Pose Dangers to Teens
December 12, 2024 | Source: AXIOS | by Megan Morrone
Platforms and apps that let users create and chat with AI-powered bots can addict teenagers, encourage self-harm and expose minors to adult content, experts say.
Why it matters: Looser regulation of AI in the wake of the 2024 election could give freer rein to makers of problematic AI companion apps.
Driving the news: Parents in Texas on Monday filed a federal product liability lawsuit against companion app Character.AI and its founders, who have left the company.
- The lawsuit includes screenshots of a message from a “character” encouraging a teen to kill his parents over restrictive screen time limits.
- In October, a Florida mom also sued Character.AI, blaming the company for her 14-year-old son’s suicide.
- Character.AI spokesperson Chelsea Harrison said the company doesn’t comment on pending litigation, but provided a statement saying Character.AI aims “to provide a space that is both engaging and safe for our community.”