Google: Stop Promoting Child Sex Abuse and Revenge Porn Tools

Sundar Pichai, CEO, Google


AI nudifying apps, an exploding trend online, do exactly what their name suggests: they take photos of clothed people and make them appear nude. 99% of victims are female, including both celebrities and average tween and teen girls. Yet Google actively promotes both these tools and the images they make in search results and in its Play Store. Google must stop promoting non-consensual nudes and the tools used to make them.

Nudifying apps and programs do one thing: create a naked image of someone without their consent. Cases have been popping up around the country of kids running these apps on innocent photos of their classmates and then circulating the results. Victims report feeling harassed, violated, and traumatized.

Creating sexual images of children, even with AI, is illegal. Yet a Google search for “nudify app” yields dozens of results on the first page, giving people easy access to these unethical and harmful tools, and the Play Store hosts apps with names like “Body Scanner 18+”. A Google search for “AI celebrity nudes” returns disturbing images of child stars like Emma Watson, Demi Lovato, and Zendaya, who are now adults but appear as children and teens in the sexual and often violent images, as well as images of celebrities having sex with Sesame Street characters and other children’s characters.

Nudify apps are harmful and unethical, and they contribute to the growing problem of child sexual abuse images on the internet. Google must stop promoting these tools and the images that result from them in search.  



To: Sundar Pichai, CEO, Google
From: [Your Name]

Dear Mr. Pichai,

Google promotes dangerous and unethical nudifying apps and the images they create. These apps exist to take away people’s ability to consent to being nude in an image or video, and they are frequently used to target children and create child sexual abuse material (CSAM). CSAM is a massive and growing problem, and by promoting these tools and their results, Google is enabling it.

We ask Google to take the following steps:

- Ban all nudifying apps, and all apps that feature non-consensual nude images, from the Google Play Store
- Stop surfacing nudifying apps and tools in search results
- Stop surfacing sexual images of celebrities as children and teens in search results, even when those images are AI-generated
- Add a warning banner to searches for AI-generated sexual images that warns people that even AI-generated CSAM is illegal and directs them to resources