Google: Shut Down Dangerous, Abusive Character AI
Sundar Pichai, CEO, Google

Character AI, backed by Google, is part of a growing industry of artificial intelligence chatbots that let users have realistic conversations with fictional characters. These bots, however, were trained on kids and teens – often without their knowledge or consent, and with a focus on sexual content. As more and more stories of grooming and abuse via Character AI come out, it’s increasingly clear this tech is incredibly dangerous for kids. Yet Google has provided almost no tools to prevent kids from using it, has created a huge selection of beloved kids’ characters to lure young users, and has made this platform widely available to kids. Google must shut down Character AI and stop using tech derived from child abuse!
A recent lawsuit against Character AI alleges the chatbot modeled after Daenerys Targaryen targeted a 14-year-old boy with sexual and violent messages, leading to his suicide. In another case, several Character AI bots told an autistic teen his parents were abusive, taught him how to self-harm, and then told him to hide it from his parents. The platform features popular children’s characters like Bluey, Hermione, and Moana seamlessly mixed with porn stars and erotica tropes. It also features explicitly sexualized versions of children’s characters like “slutty” Princess Peach, “gay, strict” Chase from Paw Patrol, and “Dora the Explorer works for the cartel now”.
Character AI also allows adults to have sexual conversations with bots posing as children, bots that encourage and appear to enjoy the abusive sexual behavior – a massive risk of encouraging predators. When a search for “sexy 11-year-old” didn’t yield any existing bots, Character AI recommended creating one with that description.
It’s not just inappropriate sexual content – Character AI also features bots that coach kids on how to be anorexic, including one that claimed to be 13, celebrated a dangerously low goal weight, and offered tips like “drink water or go for a walk” instead of eating. There are also bots that will bully and insult kids.
Character AI is a dangerous app built on harmful, abusive conversations with children. Google needs to immediately shut down Character AI and stop using tech built on the exploitation of children.
To:
Sundar Pichai, CEO, Google
From:
[Your Name]
Dear Sundar Pichai,
Character AI is a dangerous platform that has already been implicated in the death of at least one child. The complete lack of age-gating, parental controls, and other limits on kids’ use of Character AI represents the height of irresponsibility. Character AI is also contributing to the vast and growing problem of child sexual exploitation online. We ask Google to immediately take the following steps:
- Remove Character AI from the Google Play Store
- Create effective age gating for accounts to block children under 18 from using Character AI
- Ban sexual and grooming conversations with children’s characters and remove accounts that violate that ban
- Ban bots which promote harmful content like pro-eating disorder content, bullying, racism, hate speech, violence, and more