On June 11, Meta announced new measures to prevent AI nudifiers from advertising on its platforms. Prior to that, Indicator's sustained reporting had identified more than ten thousand Meta ads promoting nudifier tools, which let users upload an image of a person and digitally remove that person's clothing or insert them into a sexually explicit video.
In June, Meta said it had stepped up its enforcement efforts and “developed new technology specifically designed to identify these types of ads.”
But since the company announced its countermeasures, it has allowed thousands of new AI nudifier ads to run on Instagram and Facebook, according to new reporting from the American Sunlight Project (ASP) and Indicator.
We also found that Meta’s implementation of European Union transparency requirements has allowed the companies behind the ads to hide behind gibberish aliases like “ddd,” “Potozmgz,” and “111.”
The EU’s Digital Services Act requires major online platforms to display the entity that paid for and benefits from an ad that was shown to users in the region. Meta’s system appears to allow advertisers to easily use fake or nonsense names, thereby concealing their identity and effectively defeating the DSA’s transparency requirements.
Our analysis found that the disclosures for Meta ads shown in Singapore and Taiwan were more accurate, potentially offering a model for the EU and other jurisdictions to follow.