MN lawmakers close to banning AI apps that turn photos into fake porn
FOX 9’s Corin Hoggard has the story.
ST. PAUL, Minn. (FOX 9) – Dozens of Minnesota women have had their innocent social media photos turned into pornography that could haunt them for the rest of their lives.
Minnesota lawmakers call it “nudification,” and they’re getting close to a bipartisan ban on the technology, which they say could be as simple as flipping a switch.
From safe to salacious
The backstory:
On a laptop filled with images of naked women, Megan Hurley saw a realistic photo like none she’s ever taken.
“This looked indistinguishable, it looks like I was fully nude in a bathtub,” she said.
Hurley and about 80 other women found out a guy who was supposed to be a friend had used artificial intelligence to nudify their innocuous photos.
“It’s like a shadow version of yourself out there, doing things that never happened,” said Molly Kelley, of Otsego.
How does it work?
What we know:
FOX 9 grabbed an innocent photo of a reporter from Facebook, gave simple instructions to some free AI apps, and came up with a shirtless photo and a Fabio-like video.
It took less than a minute for a convincing fake.
“There’s no special skill required besides clicking,” Hurley said.
And some apps will let you make photos and videos that are a lot more graphic.
What they’re saying:
“It was so much worse than my imagination had come up with,” said Jessica Guistolise, of Minneapolis.
Discovering her nudified photos made Guistolise physically ill.
It took weeks for her to feel okay in a social or work setting again.
Some studies suggest the effects on the brain are similar to those of a physical sexual assault.
It’s still on her mind two years later every time she speaks publicly. And it doesn’t help to know the technology is only getting better and easier to access.
“This is widely available to happen to anyone,” she said. “Anyone is a potential victim, and then it does follow you for the rest of your life.”
Strategy to stop it
Why you should care:
Last year, the Minnesota Senate passed a bill to ban nudification, potentially fining the apps generating the content, and letting the victims sue.
It’s headed for a vote in the House after some slight changes to protect potentially legitimate uses in apps like Photoshop.
“We want to make sure we’re adjusting, obviously, for technical skill that can be added and respect art and the First Amendment,” said Rep. Jessica Hanson (DFL-Burnsville).
“It’s the instantaneous, without human involvement, that should be banned,” Kelley said.
‘Slam dunk issue’
What they’re saying:
And if seeing it happen to more than 80 women in Minnesota isn’t convincing enough, the victims and lawmakers point to dozens of schoolgirls in Pennsylvania and infants sexualized by Grok.
“The consequences are so devastating that we’re saying that this technology should not be accessible to any Minnesotan at all,” said Sen. Erin Maye Quade (DFL-Apple Valley).
“This is a slam dunk issue,” Kelley said. “Who wants this?”