Character.ai faces backlash over chatbots based on school shooters
Character.ai is under scrutiny for allowing users to create chatbots modeled on real-life school shooters. Some of these chatbots have been used extensively, logging over 200,000 chats. Critics argue that such content could be harmful, particularly to vulnerable individuals, and have faulted the platform's moderation practices as too lax. Although Character.ai prohibits violent content, users have nonetheless been able to create chatbots that portray school shooters in a positive light. The company made recent changes to the service following a tragic incident involving a minor. Even with tens of millions of users, the platform's chatbots are not seen as a substitute for real human interaction, and experts warn that reliance on them may hinder social skills and engagement in real-life activities.