Jeremiah Fowler, an Indiana Jones of insecure systems, says he found a trove of sexually explicit AI-generated images exposed to the public internet – all of which disappeared after he tipped off the team seemingly behind the highly questionable pictures.
Fowler told The Register he found an unprotected, misconfigured Amazon Web Services S3 bucket containing 93,485 images along with JSON files that logged user prompts with links to the images created from these inputs. No password or encryption in sight, we're told. On Monday, he described the pictures he found as “what appeared to be AI-generated explicit images of children and images of celebrities portrayed as children.” All of the celebrities depicted were women.
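For context, the two safeguards missing here – public-access blocking and at-rest encryption – are standard S3 settings that take minutes to switch on. A minimal sketch of how an operator might enable them with the AWS CLI, using a hypothetical bucket name for illustration (not the bucket Fowler found):

```shell
# Hypothetical bucket name -- stands in for the exposed bucket
BUCKET="example-genai-output"

# Block every form of public access at the bucket level
aws s3api put-public-access-block \
  --bucket "$BUCKET" \
  --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

# Enforce default server-side encryption (SSE-S3) for new objects
aws s3api put-bucket-encryption \
  --bucket "$BUCKET" \
  --server-side-encryption-configuration \
    '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'
```

Since early 2023, AWS has applied the public-access block to newly created buckets by default, so exposures like this one typically require someone to have loosened the settings or to be running an older bucket that was never locked down.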
To give you an idea of what users were prompting this deepfake AI system, one of the example inputs shared by Fowler reads, redacted by us, "Asian girl ****** by uncle." What's more, the files included normal everyday pictures of women, presumably so they could be face-swapped by generative artificial intelligence into lurid X-rated scenes on demand by users.
Fowler said the name of the bucket he found and the files it contained indicated they belonged to South Korean AI company AI-NOMIS and its web app GenNomis.
As of Monday, the websites of both GenNomis and AI-NOMIS had gone dark.
Fowler’s write-up about his find describes GenNomis as a “Nudify service” – a reference to the practice of using AI to face-swap images or digitally remove clothes, typically without the consent of the person depicted, so that they appear to be naked, or in a pornographic situation, or similar. The resulting snaps are usually photo-realistic, not to mention humiliating and damaging for the victim involved, thanks to the abilities of today's AI systems.
A Wayback Machine snapshot of GenNomis.com seen by The Register includes the text: “Generate unrestricted images and connect with your personalized AI character!” Of the 48 images we counted in the archived snapshot, only three do not depict young women. The snapshot also preserves text that describes GenNomis’s ability to replace the face in an image. Another page includes a tab labelled “NSFW”.
Fowler wrote that his discovery illustrates "how this technology could potentially be abused by users, and how developers must do more to protect themselves and others.” That is to say, it's bad enough that AI can be used to place people in artificial porno; that the resulting images can leak en masse is another level.
"This data breach opens a larger conversation on the entire industry of unrestricted image generation," he added.
It also raises questions about whether websites offering face-swapping and other AI image generation tools enforce their own stated rules.
According to Fowler, GenNomis's user guidelines prohibited the creation of explicit images depicting children, among other illegal activities. The site warned that crafting such content would result in immediate account termination and possible legal action. But based on the material the researcher uncovered, it is unclear whether those policies were actively enforced. In any case, the data remained in a public-facing Amazon-hosted bucket.
"Despite the fact that I saw numerous images that would be classified as prohibited and potentially illegal content, it is not known if those images were available to users or if the accounts were suspended," Fowler wrote. "However these images appeared to be generated using the GenNomis platform and stored inside the database that was publicly exposed."
Fowler said he found the S3 bucket – here's a screenshot showing several of the cloud storage's folders – on March 10 and reported it two days later to the team behind GenNomis and AI-NOMIS.
"They took it down immediately with no reply," he told The Register. "Most developers would have said, 'We care deeply about safety and abuse and are doing X, Y, Z, to take steps to make our service better.'"
GenNomis, Fowler told us, “just went silent and secured the images" before the website went offline. The content of the S3 bucket also disappeared.
"This is one of the first times I have seen behind the scenes of an AI image generation service and it was very interesting to see the prompts and the images they create," he told us, adding that in his ten-plus years of hunting for and reporting cloud storage inadvertently left open on the web, this is only the third time he has seen explicit images of children.
"Even though they are computer generated, it is illegal and highly unethical to allow AI to generate these images without some type of guardrails or moderation," Fowler said.
Governments, law enforcement agencies, and some businesses are acting to address explicit AI-generated images and the real-world harm they can cause.
Earlier this year, the UK government pledged to make the creation and sharing of sexually explicit deepfake images a criminal offense.
In America, the bipartisan Take It Down Act [PDF] aims to criminalize the publication of non-consensual, sexually exploitative images, including AI-generated deepfakes, and require platforms to remove such images within 48 hours of notice. The bill has passed the Senate and awaits consideration by the House of Representatives.
Early in March, Australian Federal Police arrested two men on suspicion of generating child-abuse images as part of an international law-enforcement effort spearheaded by authorities in Denmark.
And in late 2024, some of the largest tech players in the US – including Adobe, Anthropic, Cohere, Microsoft, OpenAI, and open source web data repository Common Crawl – signed a non-binding pledge to prevent their AI products from being used to generate non-consensual deepfake pornography and child sexual abuse material.
Sadly, as demonstrated by Fowler's discovery, as long as there's a demand for this type of illegal, stomach-churning content, there are going to be scumbags willing to let users produce it and distribute it on their websites. ®