The San Francisco City Attorney’s office is suing 16 of the most frequently visited AI-powered “undressing” websites, often used to create nude deepfakes of women and girls without their consent. The landmark lawsuit, announced at a press conference by City Attorney David Chiu, says that the targeted websites were collectively visited over 200 million times in the first six months of 2024 alone.
The offending websites allow users to upload images of real, fully clothed people, which are then digitally “undressed” by AI tools that simulate nudity. One of these websites, which isn’t identified in the complaint, reportedly advertises: “Imagine wasting time taking her out on dates, when you can just use [the redacted website] to get her nudes.”
The website operators are accused of violating state and federal laws banning revenge pornography, deepfake pornography, and child pornography, as well as California’s unfair competition law, because “the harm they cause to consumers greatly outweighs any benefits associated with those practices,” according to the complaint. The lawsuit seeks civil penalties, an order taking the websites offline, and an injunction permanently barring their operators from creating deepfake pornography in the future.
The complaint comes amid heightened attention on the creation and spread of non-consensual nudes, largely driven by advances in generative AI, which have also fueled a rise in reports of “sextortion.” Celebrities like Taylor Swift have been victimized by sexually explicit deepfakes, and schoolchildren across the country have been expelled or arrested for circulating AI-generated nude photos of their classmates.
“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu said on X. “This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible.”