After Molly Kelley discovered her longtime friend uploaded photos from her social media to a website that generated sexually explicit “deepfakes” of her, she stopped going to work in person.
Kelley said she struggled to go out in public and suffered from paranoia that those around her would see the pornographic videos, which were made without her consent.
“Sitting in a meeting with clients or coworkers, I was deeply paranoid that they’d seen it and they’re wondering if it’s me,” Kelley said. “It’s the sword of Damocles over your head.”
Megan Hurley, whose images were used by the same man, said she missed two months of work. A massage therapist, Hurley was no longer comfortable being at her workplace alone.
Kelley and Hurley, who both knew the man for years, said they were among about 85 women whose likenesses were used to create the pornography. Deepfake images or videos are realistic but fabricated representations of a person that can be difficult to distinguish from actual photos or footage.
But those who went to police, Kelley said, received little help. She called the response from law enforcement agencies “inconsistent” and “disheartening,” saying officers instructed her to politely ask the man to stop.
Now Kelley and Hurley are asking Minnesota lawmakers to pass legislation aimed at the websites that allow users to generate explicit images.
They are supporting legislation offered by Minnesota Sen. Erin Maye Quade, DFL-Apple Valley, that would prohibit website operators from allowing users to “nudify” images, altering them to show an artificial representation of a person’s intimate body parts not shown in the original photo.