Minnesota considers blocking 'nudify' apps that use AI to make explicit images without consent

ST. PAUL, Minn. (AP) — Molly Kelley was stunned to discover in June that someone she knew had used widely available “nudification” technology to create highly realistic and sexually explicit videos and images of her, using family photos that were posted on social media.

“My initial shock turned to horror when I learned that the same person targeted about 80, 85 other women, most of whom live in Minnesota, some of whom I know personally, and all of them had connections in some way to the offender,” Kelley said.

Backed by her testimony, Minnesota is considering a new strategy for cracking down on deepfake pornography. A bill that has bipartisan support would target companies that run websites and apps allowing people to upload a photo that then would be transformed into explicit images or videos.

States across the country and Congress are considering strategies for regulating artificial intelligence. Most states have banned the dissemination of sexually explicit deepfakes or revenge porn, whether the material was produced with AI or not. The idea behind the Minnesota legislation is to prevent the material from ever being created in the first place, before it can spread online.

Experts on AI law caution the proposal might be unconstitutional on free speech grounds.

Why advocates say the bill is needed

The lead author, Democratic Sen. Erin Maye Quade, said additional restrictions are necessary because AI technology has advanced so rapidly. Her bill would require the operators of “nudification” sites and apps to turn them off to people in Minnesota or face civil penalties up to $500,000 “for each unlawful access, download, or use.” Developers would need to figure out how to turn off the function for Minnesota users.

It’s not just the dissemination that’s harmful to victims, she said. It’s the fact that these images exist at all.

Kelley told reporters last month that anyone can create "hyper-realistic nude images or pornographic video" in minutes.

Most law enforcement attention so far has been focused on distribution and possession.

Congress, states and cities are also trying other tactics

San Francisco in August filed a first-of-its-kind lawsuit against several widely visited “nudification” websites, alleging they broke state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. That case remains pending.

The U.S. Senate last month unanimously approved a bill by Democrat Amy Klobuchar, of Minnesota, and Republican Ted Cruz, of Texas, to make it a federal crime to publish nonconsensual sexual imagery, including AI-generated deepfakes. Social media platforms would be required to remove such images within 48 hours of notice from a victim. Melania Trump on Monday used her first solo appearance since becoming first lady again to urge passage by the Republican-controlled House, where the measure is pending.

The Kansas House last month approved a bill that expands the definition of illegal sexual exploitation of a child to include possession of images generated with AI if they're “indistinguishable from a real child, morphed from a real child’s image or generated without any actual child involvement.”

A bill introduced in the Florida Legislature would create a new felony for people who use technology such as AI to generate nude images and would criminalize possession of child sexual abuse images generated with such technology. Broadly similar bills have also been introduced in Illinois, Montana, New Jersey, New York, North Dakota, Oregon, Rhode Island, South Carolina and Texas, according to an Associated Press analysis using the bill-tracking software Plural.

Maye Quade said she'll be sharing her proposal with legislators in other states because few are aware the technology is so readily accessible.

“If we can’t get Congress to act, then we can maybe get as many states as possible to take action,” Maye Quade said.

Victims tell their stories

Sandi Johnson, senior legislative policy counsel for the victims' rights group RAINN — the Rape, Abuse and Incest National Network — said the Minnesota bill would hold websites accountable.

“Once the images are created, they can be posted anonymously, or rapidly widely disseminated, and become nearly impossible to remove,” she testified recently.

Megan Hurley also was horrified to learn someone had generated explicit images and video of her using a “nudification” site. She said she feels especially humiliated because she's a massage therapist, a profession that's already sexualized in some minds.

“It is far too easy for one person to use their phone or computer and create convincing, synthetic, intimate imagery of you, your family, and friends, your children, your grandchildren,” Hurley said. “I do not understand why this technology exists and I find it abhorrent there are companies out there making money in this manner.”

AI experts urge caution

However, two AI law experts — Wayne Unger of the Quinnipiac University School of Law and Riana Pfefferkorn of Stanford University's Institute for Human-Centered Artificial Intelligence — said the Minnesota bill is too broadly constructed to survive a court challenge.

Limiting the scope to images of real children might help the bill withstand a First Amendment challenge, since such images are generally not protected speech, Pfefferkorn said. But she said it could still conflict with Section 230 of the federal Communications Decency Act, which shields websites from liability for content their users generate.

“If Minnesota wants to go down this direction, they'll need to add a lot more clarity to the bill,” Unger said. “And they'll have to narrow what they mean by nudify and nudification.”

But Maye Quade said she thinks her legislation is on solid constitutional ground because it's regulating conduct, not speech.

“This cannot continue," she said. "These tech companies cannot keep unleashing this technology into the world with no consequences. It is harmful by its very nature.”

___

Associated Press reporters Matt O'Brien, John Hanna and Kate Payne contributed to this story from Providence, Rhode Island; Wichita, Kansas; and Tallahassee, Florida, respectively.

___

This story has been corrected to show the spelling of Molly Kelley's last name is Kelley, not Kelly.
