A man whom Molly Kelley, Jessica Guistolise and Megan Hurley had known for years used non-explicit photos from their private Facebook pages to create pornographic images and videos of them. The three Minnesotans were among 80 women whose photos the man altered.
The use of nudification technology to alter their photos has taken a toll on their mental health, their jobs and their sense of safety.
“I cannot overstate the harm this technology has caused. I was afraid to leave my house for weeks. I suffered, and I still suffer, trying to understand who to trust,” Guistolise told the House Commerce Finance and Policy Committee while testifying in favor of HF1606 Thursday.
Kelley’s doctors were worried about her health, and she stopped going to work in person because she was paranoid about where the images existed. Guistolise had to notify her employer’s human resources staff about the images, and she will have to do the same in any future job. Hurley still fears someone could run a reverse image search on the photos of her to find her workplace and show up to revictimize her.
The pornographic images will also live on the internet forever, they said, because they will never be able to find every place the images exist.
“The ease with which these nonconsensual images can be created and disseminated means that no one is truly immune. The internet is forever. The damage done is irreversible,” Guistolise said.
What is nudification technology?
Nudifying features on websites, applications, software and programs allow users to upload a non-explicit photo of a person and create deepfake pornographic images, videos and other sexually explicit content, including child sexual abuse material, said Rep. Jessica Hanson (DFL-Burnsville).
“The victim pool here is continuously expanding and so does the advancement of this technology,” Hanson said. “The violent nature of the materials created using this technology and the predators’ responses to them are also escalating.”
To address the issue, she’s sponsoring HF1606.
As amended, the bill would prohibit a user from accessing, downloading or using a website, application, software or program to nudify an image or video or to do so on behalf of someone else. It would ban advertisements or promotions of websites, applications, software or programs that can nudify images or videos. The bill would also allow a person whose image or video was nudified to file a lawsuit against the person who created the image or video.
The committee ran out of time for member discussion after hearing testimony on the bill, which was laid over for further discussion at a later date.
Congress passed the Take It Down Act in 2025 to criminalize the nonconsensual publication of intimate images, including deepfakes, but Hurley said the law doesn’t go far enough because it still allows the creation of nudified photos and images. Hurley said she gave the nudified images of herself to both her local police department and the FBI.
“But unfortunately, as law currently stands, there is no path to justice for us,” she said.