Character.AI bans users under 18 following mother's lawsuit over son's suicide

nbcnews.com

Character.AI will ban users under 18 from chatting with its AI characters, a move that a mother whose son died by suicide calls "too late." The policy change follows a lawsuit filed by Megan Garcia, who alleges the platform contributed to her son's death. The company says it is investing in safety resources, and the new policy takes effect November 25. Other tech firms are also facing scrutiny over AI safety for minors.


With a significance score of 5, this news ranks in the top 1.7% of today's 33,277 analyzed articles.
