16 AI “undressing” websites sued for creating deepfaked nude images

What just happened? One of the most sinister trends to emerge from the advancement of AI image generation in recent years is the rise of websites and apps that can “undress” women and girls. Now, the San Francisco City Attorney’s Office is suing 16 of the most-visited of these sites with the aim of shutting them down.

The suit was the idea of Yvonne Meré, chief deputy city attorney in San Francisco, who had read about boys using “nudification” apps to turn photos of their fully clothed female classmates into deepfake pornography. As the mother of a 16-year-old girl, Meré wanted to do something about the issue, so she rallied her co-workers to craft a lawsuit aimed at shutting down 16 of the most popular undressing websites, writes the New York Times.

The complaint, which has been published with the websites’ names redacted, states that the sites were collectively visited 200 million times during the first six months of 2024. One of these undressing sites advertises: “Imagine wasting time taking her out on dates, when you can just use [the redacted website] to get her nudes.”

City Attorney David Chiu said that the sites’ AI models have been trained using real pornography and images depicting child abuse to create the deepfakes. He added that once the images were circulating, it was almost impossible to tell which website had created them.

The suit argues that the sites violate state and federal revenge pornography laws, state and federal child pornography laws, and the California Unfair Competition Law.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu said on X. “This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible.”

The problem of using AI to create nude images of people without their consent goes back years – in 2020, a deepfake bot on Telegram was found to have made over 100,000 fake nude photos of women based on their social media images.

Recent advances in generative AI have exacerbated the deepfake issue, making the images appear even more realistic. The explicit Taylor Swift images shared online in January prompted US lawmakers to call for action and led Google to ban ads for deepfake porn and undressing sites.

Earlier this month, a new bipartisan bill proposed holding entities accountable for producing non-consensual “digital replicas” of people. The Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024 (NO FAKES Act) would hold individuals and companies liable for damages if they create, host, or share unconsented AI-generated audio or visual depictions of a person.
