Minnesota just banned the apps that make deepfake nudes

BY THE OPTIMIST DAILY EDITORIAL TEAM

Two years ago, Molly Kelley found out that a close family friend had used a nudification website to make nonconsensual deepfake images of her and dozens of other women. About 80 women in Minnesota were affected by the same person. When she tried to figure out what legal recourse she had, she found almost none. The images were stored only on the man's computer, so laws banning dissemination didn't apply. There was no indication of intent to share them, which ruled out the state's revenge porn statute. None of the women were minors, so possessing the images wasn't a crime. No existing law allowed her to sue for restitution.

So she started making phone calls. "This has taken every spare moment I have," Kelley said. She educated lawmakers, gave testimony, and advocated for the better part of two years, all while raising two children, working full-time, and completing law school. The result of that work passed the Minnesota Senate 65-0, and last week, Governor Tim Walz signed House File 1606 into law.

What the law actually does

Minnesota House File 1606 is the first law in the United States to ban nudification apps, the tools that allow anyone to upload a photograph of a clothed person and have it digitally transformed to appear nude. No technical skill required. Rather than targeting the people who misuse these tools, the bill targets the function of the technology itself.

The law will allow survivors to sue app owners for damages. The state attorney general will also be empowered to collect fines of up to $500,000 per violation. The bill includes an exemption for general editing tools like Photoshop, where producing nonconsensual intimate imagery requires meaningful technical expertise. The target is software built specifically to automate the process.

Why this was harder to address than it sounds

Federal efforts to create similar protections have stalled repeatedly.
The DEFIANCE Act, which would create a civil right of action for survivors of nonconsensual deepfakes, has twice passed the Senate but has not reached a House floor vote. Last year's Take It Down Act made it a federal crime to share nonconsensual intimate images, but it does not allow survivors to sue the platforms or app makers. That gap matters because, as Kelley's case illustrated, many incidents of this kind never involve images being shared at all. The harm begins at creation.

"These images don't exist without a third-party involvement and some sort of machine learning model," she said. "I've dedicated the past two years of my life to finding a solution to mitigate the harm when it's actually caused, which is at creation."

The scale of the problem has grown quickly. In December, X's integrated chatbot Grok generated and posted over 1.8 million sexualized images of women in nine days after enabling free image generation, according to reporting from The New York Times and the Center for Countering Digital Hate. X had apparently made significant efforts to curb nonconsensual deepfakes, but users have continued to find ways around its guardrails.

The independent media organization Indicator has tracked 23 cases of deepfake abuse targeting school communities across the United States since 2023. RAINN, which runs the national sexual assault hotline, has reported an increase in children calling about digital violence over the past five years. A Centers for Disease Control survey conducted from 2023 to 2024 found that one in 10 women had experienced tech-facilitated sexual abuse in the prior year; one in three said the same when asked about their lifetime.

What comes next

California and other states have moved to hold creators and platforms accountable for deepfakes, but Minnesota's law is the first to target the underlying technology at the point of production. Kelley sees the issue in terms that go beyond any single piece of legislation.
"Deep down, this is a manipulation and a control issue of women," she said.

The Trump administration has indicated support for federal preemption of state AI laws, which could void Minnesota's bill if that policy is formalized. For now, advocates are watching to see whether other states follow Minnesota's lead before federal policy settles the question.