
Google is running ads for undresser services, the world is seeking fact checks about Imane Khelif, and AI blends academic results with "some people's" beliefs about sleep training.
This newsletter is a ~7-minute read and includes 44 links.
TOP STORIES
STILL MONETIZING
On July 31, Google announced new measures against non-consensual intimate imagery (NCII). The most significant change is a new Search ranking penalty for sites that “have received a high volume of removals for fake explicit imagery.” This should reduce the reach of sites like MrDeepFakes, which got 60% of its traffic from organic search in June1 and has a symbiotic relationship with AI tools like DeepSwap that fuel the production of deepnudes.
This move is very welcome, and I know the teams involved will have worked hard to get it over the finish line.2
There is much more the company can do, however. The sites that distribute NCII are only half of the problem. The “undresser” and “faceswap” websites that generate non-consensual nudes are mostly unaffected by this update, because they don’t typically host any of the non-consensual deepnudes that their users create.
And these sites are also getting millions of visits from Search:3

Source: SimilarWeb
It is these undressers that are at the center of Sabrina Javellana’s heart-wrenching story. In 2021, Javellana was one of the youngest elected officials in Florida’s history when she discovered a trove of deepnudes of herself on 4chan. I urge you to read the New York Times article on how this abuse drove her away from public life.
Google is not yet ready to go after undressers in Search (a mistake, imo). But it has promised to ban all ads for deepfake porn generators.
Unfortunately, it is breaking that promise. Over the past week, I was able to trigger 15 unique ads while searching for nine different queries4 related to AI undressing. Below are three examples (ads are labeled “sponsored”):

The ads point to ten different apps or websites. As is typical for this ecosystem of AI-powered grifting, these tools offer a variety of text- and image-generation services, and it’s hard to tell to what extent NCII is their dominant use case.
This is the case with justdone[.]ai, which appears to have run hundreds of Google Ads. Most of these were for legitimate use cases, but they included a couple of ads advertising an “Undresser ai” (as well as two ads for AI-written obituaries, which have previously gotten Search into trouble).
And deepnudes are clearly a core service for several of the sites I found advertising with Google, including ptool[.]ai, which promises that “our online AI outfit generator lets you remove clothes,” and mydreams[.]studio, which has run at least 15 Google ads.
MyDreams suggests female celebrity names like Margot Robbie as “popular tags” and gives users the option to generate images through the porn-only model URPM.

Google is not getting rich off of these websites, nor is it willfully turning a blind eye. These ads ran because of classifier and/or human mistakes, and I suspect most will be removed soon after this newsletter is published. But undressers are a known phenomenon, and their continued capacity to advertise on Google suggests that the teams fighting this abuse vector should be getting more resources (maybe they could be reassigned from teams working on nonexistent user needs).
And to be clear, Google is not alone. One of the websites above was running ads on its Telegram channel for FaceHub:AI, an app that is available on Apple’s App Store and promises you can “change the face of any sex videos to your friend, friend’s mother, step-sister or your teacher.” Last week, Context found four other such apps on the App Store advertising on Meta platforms.