In collaboration with Wired, Indicator is launching a map of known cases of AI nudifiers being used in schools around the world. The map currently links to 88 instances in which AI tools were used to virtually “undress” girls and young women. The incidents span 28 countries and have affected more than 640 children. The real number is much larger, since many of the articles we mapped don’t cite the number of students affected.
You can view the map and all of our related coverage on nonconsensual synthetic nudes here.
The map offers a sobering look at the spread of nonconsensual deepfakes among minors and young adults. And it’s only the tip of the iceberg: UNICEF estimated that as many as 1.2 million children in 11 countries have been victimized by nonconsensual sexual deepfakes.
Parents of one victim told USA Today that their teenage daughter was “once social and bubbly” but “… has become reclusive … afraid to go to the grocery store or pharmacy out of fear that other patrons have seen the deepfake images.”
These stories are collected in the map, which focuses on incidents related to schools. We chose this subset because it allows for easier comparison across cases and shows how careless deployment of AI technology has armed underage students with a potent tool of abuse. Outside of schools, explicit deepfakes have also targeted local officials, researchers, university students, teachers, and other underage victims.
The map helps show the issues and challenges that cut across school-related incidents: boys who seem unaware of the harm they can cause to others, school administrators and law enforcement who struggle to respond, and the toll such incidents take on victims.
The data can help inform approaches to prevention and mitigation. The map also highlights the resilience and courage of teenage girls who choose to fight back against technology-assisted gender-based violence.
Indicator has extensively covered the platform dynamics and economic incentives of these tools of abuse. Despite platform promises to crack down on nudifier apps and tools, the problem remains unresolved. As recently as this week, Meta ran ads for xjoy[.]ai, a tool we have previously flagged on multiple occasions. The ads say the tool can turn “one picture of a beautiful woman” into a nude video.

Indicator will keep updating the map. If you’re aware of any cases we missed, please email me at [email protected].
And for any victims of image-based sexual abuse reading this article, may the words of one German student ring true: “No matter how difficult it is, speak up. Please go to the police and file a report. I want to tell every victim: Hey, you’re not alone! Have the courage to do something and say that what they did wasn’t right, that I don’t want to be treated like that. It’s a crime!”

