Women and girls targeted by nonconsensual deepfake nudes have been seeking justice through a range of channels, all imperfect. They have requested takedowns from platforms, worked with elected officials on new legislation, and sued in court.

On the third front, lawsuits have to date largely focused on the people who created and shared AI nudes. One exception is the San Francisco City Attorney’s landmark case against 16 websites that enable people to create nonconsensual deepfake nudes.

Now, a 17-year-old from New Jersey has filed a lawsuit against an AI tool that she says was used by a classmate to “undress” her in 2023. The teenager’s suit alleges that ClothOff transformed an Instagram photo of her at the beach into a synthetic nude.

The complaint names AI/Robotics Venture Strategy 3 Ltd. as the British Virgin Islands-registered company behind ClothOff. It also names Alaiksandr Babichau and Dasha Babicheva of Minsk, Belarus, as the people who developed, maintain, and oversee the “operations of ClothOff and its affiliated websites and services.” The siblings were linked to ClothOff as part of an investigation by The Guardian.

The biggest obstacle to bringing ClothOff to justice is likely to be the location of the defendants, which makes them difficult to serve, let alone hold accountable. But if the case proceeds, it could potentially bring justice to the website’s victims and unearth valuable information on the web of players that make up the nudifier economy. The complaint alleges that several nudifiers leverage ClothOff’s API, for instance, which would be a big revelation if confirmed in discovery.

ClothOff said in a statement to Fox News that it is “technically impossible” to use its service to create child sexual abuse material, or CSAM, and that it “block[s] any attempt to generate videos or images involving minors or non-consenting individuals.” The company also claims to have “more serious” safety measures than most social networks, and compared the pushback it is receiving to the reaction Photoshop got when it first launched.

The co-counsels on the case are Shane Vogt, a Florida-based trial lawyer, and John Langford of the Yale Media Freedom & Information Access Clinic.

I spoke to Vogt, who, it turns out, is an Indicator member, on Friday, Oct. 24 via Google Meet.

He said he’s confident that there are grounds to pursue ClothOff based on existing law, including on product liability. He told me the case is bigger than his client, whom he called a “phenomenal person,” because it’s also about shutting down an ecosystem of AI tools that are victimizing thousands of people around the world.

What follows is an abridged and lightly edited transcript of our conversation.

How did you first come across the topic of synthetic non-consensual intimate imagery? Was this your first exposure to it or have you had other cases in this space?

Representing this particular client was the first time I'd come across this material or even knew these apps existed. And then obviously, I saw the stories and everything about it happening in a number of places.

I started doing my research into potential claims and saw that there were some pretty viable claims under already existing law. I found it interesting that a lot of the initial responses that you saw from state law enforcement and others were that the current laws don't cover it. My research showed that they did. Once I really dug in and determined that there were viable claims under existing statutes, particularly federal laws, I went ahead and found a way to start helping the client out.

The first lawsuit that we filed involved actual users of the sites. And the thought process on that was that initially the client just wanted to find out, you know, what happened and what images were used. And then unfortunately you get into a situation – which as you can see from the filings in the prior case, I'm somewhat limited in talking about because of confidentiality protections – where because we are asserting claims including under federal child pornography statutes, people invoke their Fifth Amendment rights. And so at that point it becomes extremely difficult to get information out of anybody. 

This case is in Jersey, you’re in Florida. I saw there's a pro bono clinic involved too. Could you give me a high-level sense of the legal team assembled?

Yeah. So, as I was working on the initial case that I'm involved in in New Jersey against the classmate, the Yale Media Freedom and Information Access Clinic contacted me about the possibility of getting involved in a case together. John Langford up there is amazing, as well as their staff attorneys, and the students involved in the clinic have been just phenomenal. 

It's actually pretty amazing when you stop and think about us collaborating on this because broadly speaking, I more often am on the other side of groups like that. I handle plaintiff and defense work in speech and privacy cases, but a lot of times I'm adverse to media companies in some of these bigger cases. [Editor’s note: Vogt represented Hulk Hogan in Bollea v. Gawker.]

In fact, I have a case where the other side includes one of the lawyers who's involved with this clinic. And you know, we made sure that that was fine with everyone before we started, and it was. But to me, that's just an interesting aspect … that this is such a significant problem that the likes of us have joined up together to try and fight it.

Because to me the most troubling aspect of this is that the victims, in large part, are kind of being left to handle it themselves. It doesn't seem like law enforcement is taking an aggressive approach on this. In doing my research, I've seen a few criminal prosecutions against people, but from what I could tell they tended to be cases involving very young child pornography victims.
