TLDR
- San Francisco City Attorney sues 16 to 18 websites over AI-generated nonconsensual nude images
- Targeted websites had over 200 million visits in the first half of 2024
- Images depict women and girls, including minors
- Lawsuit cites violations of laws on deepfake porn and child sexual abuse material
- Some cases involve middle and high school students
The San Francisco City Attorney’s office has filed a lawsuit against multiple websites offering AI-powered services to create nonconsensual nude images.
The legal action targets 16 to 18 websites that allow users to generate fake nude pictures of women and girls without their consent.
City Attorney David Chiu announced the lawsuit on August 15, 2024. The complaint alleges that these websites have violated California and U.S. laws related to deepfake pornography, revenge porn, and child sexual abuse material.
According to the lawsuit, the targeted websites collectively received over 200 million visits in just the first six months of 2024. This high traffic indicates the widespread use of these AI ‘undressing’ services.
The AI models used by these websites are reportedly trained on pornographic images and child sexual abuse material. Users can upload a picture of their target, and the AI generates a realistic, nude version of that person. Some sites claim to “see anyone naked” or suggest using their service to “get her nudes” instead of dating.
While some websites limit their services to adults, others allow the creation of images depicting minors. This has led to incidents involving underage students.
In February 2024, AI-generated nude images of 16 eighth-grade students were shared among classmates at a California middle school. A similar case occurred in Australia, where a teenager was arrested for circulating AI-generated nude images of high school students.
The lawsuit seeks to shut down these services and asks for civil penalties of $2,500 for each violation. It also calls on domain registrars, web hosts, and payment processors to stop providing services to these websites.
City Attorney Chiu expressed horror at the exploitation enabled by these websites. He stated:
“We all need to do our part to crack down on bad actors using AI to exploit and abuse real people, including children.”
Chiu emphasized that while AI has “enormous promise,” its misuse for sexual exploitation is not innovation but abuse.
The complaint highlights the severe consequences for victims of this technology. Once these fake nude images are created and shared, victims have little recourse to remove them from the internet. This can lead to long-lasting psychological, emotional, and reputational harm.
The lawsuit is part of a growing effort to address the rapid spread of nonconsensual intimate imagery (NCII) facilitated by AI technology. Governments and organizations worldwide are working to curtail this practice as it affects both celebrities and ordinary individuals.
The use of AI to generate child sexual abuse material is particularly concerning. It complicates efforts to identify and protect real victims. Some U.S. states, like Louisiana, have already passed laws specifically banning AI-generated child sexual abuse content.