
San Francisco's city attorney sued the operators of 22 AI nudifiers. He wants others to step up, too.

David Chiu, San Francisco's top lawyer, speaks to Indicator

Alexios Mantzarlis

Feb 11, 2026


In August 2024, the San Francisco City Attorney sued the operators of 16 websites dedicated to generating nonconsensual deepfake nudes. The legal action made David Chiu, the city’s chief lawyer, a key player in the global battle against so-called “AI nudifiers.”

That battle is as urgent as ever. Several countries have passed laws criminalizing the dissemination of synthetic nonconsensual intimate images (SYNCII), but nudifiers continue to profit from image-based sexual abuse.

A recent survey commissioned by UNICEF estimated that 1.2 million children across 11 countries had been targeted by a nonconsensual deepfake nude. “In some countries, this represents 1 in 25 children – the equivalent of one child in a typical classroom,” the UN organization noted.

The situation is so dire that on Tuesday, a group of internet safety organizations called for a complete ban on AI nudifiers, saying that “this functionality serves no good purpose.” Cassie Coccaro of Thorn, one of the organizations that signed the letter, wrote on LinkedIn that “it’s not common for Thorn to call for an all-out ban on any given technology” because such a drastic decision can have complex trade-offs. Not in this case.

San Francisco’s lawsuit has resulted in real change, including two settlements and 11 websites shut down in California. But Chiu is realistic about the extent of his powers, telling Indicator that “we need more people to take responsibility for addressing this and stamping it out.”

In a phone conversation, Chiu offered an update on the lawsuit, described what his office has discovered about the finances and organization of AI nudifiers, and stressed the need for technology companies that are “adjacent to and facilitating this trade” to step up their efforts.

What follows is a lightly edited transcript of our conversation.

City Attorney Chiu, thanks for being willing to talk to me. 

I wanted to start by giving Indicator readers a sense of the state of your lawsuit. Could you walk me through the number of defendants sued and served, those who have settled, and those who are currently in active litigation?

Eight defendants have been served and two – Briver LLC and Richard Tang – have settled. We're expecting default judgments against at least three others: Itai Tech, Defirex, and CodeBionic Labs. We're still trying to serve one defendant, Gaofan Xu.

But big picture, 11 of the websites that we initially went after have been shut down thus far, at least in California.

Why did you feel that the City of San Francisco was the right authority to pursue this legal case – compared to individual victims or the state?

The origin of this lawsuit was that a couple of my top litigators – who happen to be mothers of teenage daughters – read an article about teenage girls being victimized by AI-generated nonconsensual pornography. They texted me the article that weekend and said, “Would you approve of us investigating this?” and I said, “Of course.”

What we learned from that one article was horrifying. Everything we've learned since then has been even more horrifying. It was an investigation that took us to the darkest corners of the internet. We were horrified for the women and girls who've endured this exploitation.

We were also incredibly surprised that no one else had ever gone after these websites. I would have thought that someone else would have done it before, but we felt strongly that someone needed to act, and that was why we filed.

You talked about victims. We understand some victims have attempted to get recourse, but they have found virtually none, in part because it's almost impossible to determine which website generated the images. And once images are distributed, no one can control what happens to them or remove the copies from the web. You can't unring the bell.

The impact on victims and survivors is incredibly negative: the damage to one's reputation and mental health, the loss of autonomy… We know that these images are often used to bully, humiliate, and threaten women and girls. The FBI has warned of an uptick in extortion schemes using this pornography, and this phenomenon has impacted a shocking number of women and girls around the globe.

So we felt compelled to act. 

Yeah, surveys suggest about 6% of US teens have been targeted, with similar proportions in other countries.

One challenge in this space is that many operators of these awful websites conceal their ownership. Your initial lawsuit mentions several defendants by name but also several unnamed “Doe” defendants. Do you hope to get meaningful relief from all of them? And how are you prioritizing which to spend most time on?
