AI-Generated Dirty Pictures! What an Opportunity for Legal Eagles

August 8, 2022

Despite usually being racist, sexist, and dumber than binary code, some AI algorithms are smart and capable of amazing feats. One is the ability to create realistic digital photos of non-existent people. Another is how convincingly AI can fabricate nudity. Fake nudity is as old as art itself, extending into the modern mediums of animation, videogames, and CGI. Early animators, videogame developers, and CGI artists had minds as dirty as the rest of humanity's. DIY Photography explored fake nudity in, “A Website Sells AI-Generated Nudes Of Non-Existent Women. Why Is It (Not) Okay?”

While fake nudity is an ancient tradition, AI-generated nudity is a relatively new concept. It sounds like an innocuous concept, but could it be harmful?

“Let’s be honest, it was just a matter of time before someone starts using this type of AI to generate nudes (not dunes, nudes). Heck, I’m amazed that it hasn’t happened sooner. At the same time, now that I know AI-nudes of fake ladies exist, I just can’t wrap my head around it. As my DIYP buddy Alex pointed out, this is pretty much like some nerdy guy drawing nude sketches, and her comparison made me chuckle. But then again, these “sketches” could change the adult entertainment industry, beauty standards, and the view of men on women – sadly, not for the better.”

While the Web site in question does not currently work, it does exist. Visitors peruse a variety of AI-generated women, select different features, and, once they find the perfect nude, can purchase the image for $1. Each image is assigned a “seed number” as a receipt proving the buyer owns that unique model. It sounds like a cheaper version of NFTs.

One potential upside is that AI-generated nudes could be safer for women. Since the “women” are fake, no real people would be harmed. It could also mean fewer women being pestered for inappropriate photos, or even catcalled.

There are more negative points, though. The fake nudes reinforce harmful, sexist stereotypes, and they arguably allow people to buy and create a “woman” to suit their desires. The AI-generated nude market also sells only naked women at the moment, because there is little demand for naked men.

To create the nudes, the AI algorithms needed a large, robust dataset of nude women. Where did that come from and did data scraping hurt anyone?

“‘The verification process for public domain [images] centers around running public domain data through reverse image searches,’ the co-founder told VICE. ‘If we notice that the results are from paywalled/monetized websites, revenge po*n websites, online forums, or behind paywalls, we err on the side of caution and exclude that data since it may not have been gathered ethically.’

But still, as VICE points out, many nude and po*nographic images found online are frequently stolen from actual sex workers. Even those marked as a public domain! ‘People steal sex worker’s content all of the time, posting it to tube sites for free or dumping it into database links.’ This means that women whose images were used to feed the algorithm maybe hadn’t given their consent for something like that.”

Reddit has a discussion thread titled “Technically the Truth.” While it is technically the truth that these AI-generated images are fake, the site is still selling women and exploiting harmful standards for a profit. The “women” are fake, but is that bad? Enter the science-fiction philosophical questions of the future.

The real winners are likely to be the lawyers. Is LaMDA’s attorney available?

Whitney Grace, August 8, 2022

