States race to restrict deepfake porn as it becomes easier to create


by Madyson Fitzgerald, Georgia Recorder [This article first appeared in the Georgia Recorder, republished with permission]


April 11, 2024

After a 2014 leak of hundreds of celebrities’ intimate photos, Uldouz Wallace learned that she was among the public figures whose images had been stolen and disseminated online.

Wallace, an actress, writer and social media influencer, found out the images were ones her ex had taken without her consent and had threatened to leak.

Over the next few years, Wallace said, she spent large sums paying private companies to take down the images. It wasn’t until later that she learned those same photos had been used to make fake pornographic images of her.

“It’s just ridiculous the amount of time that people have and how much they’re profiting from these kinds of things,” Wallace told Stateline. “For them to sit there and create so much fake content of someone that clearly doesn’t want anything of that sort? Without consent? It’s just crazy to me.”

Mortified, Wallace was reluctant to share her story — at first. But in 2022, she went public with it and now she heads a nonprofit organization, Foundation Ra, that supports people who have become victims of manipulated or artificial intelligence-generated sexual images.

“I thought, ‘At what point is somebody going to do something about this?’” she asked. “And that’s when I decided to share my story and try to change the law.”

As more people, including minors, become victims of deepfake pornography and the industry that’s growing out of it, state lawmakers are pursuing legislation to deter the unauthorized creation and dissemination of digitally altered images.

Deepfakes — digitally altered photos and videos that can make someone appear to be, or be doing, just about anything — have proliferated on the internet. Examples range from simple face swaps done using readily available software to a person grafting Tom Cruise’s face and voice onto their body for content on a TikTok account.

In 2023, the total number of deepfake videos online was 95,820, up 550% from 2019, according to a report by Home Security Heroes, a group that researches best practices for online security. Pornography made up 98% of them.

The issue made international headlines in January, when fabricated sexually explicit images of pop star Taylor Swift that had been created by a free AI generator went viral, prompting lawmakers in several states to introduce legislation to combat deepfake porn, including Missouri’s Taylor Swift Act.

Several years ago, special equipment was needed to make a deepfake video. That’s no longer true, said Marc Berkman, CEO of the Organization for Social Media Safety, a national nonprofit.

“This is a clear public policy issue,” Berkman said. “This is a behavior that we recognize causes harm, does not conform to societal values, relies on new technology, and so there should be a public policy response.”

Adding to existing laws

Indiana, Texas and Virginia in the past few years have enacted broad laws with penalties of up to a year in jail plus fines for anyone found guilty of sharing deepfake pornography. In Hawaii, the punishment is up to five years in prison.

Many states are combatting deepfake porn by adding to existing laws. Several, including Indiana, New York and Virginia, have enacted laws that add deepfakes to existing prohibitions on so-called revenge porn, or the posting of sexual images of a former partner without their consent. Georgia and Hawaii have targeted deepfake porn by updating their privacy laws.

Other states, such as Florida, South Dakota and Washington, have enacted laws that update the definition of child pornography to include deepfakes. Washington’s law, which was signed by Democratic Gov. Jay Inslee in March, makes it illegal to be in possession of a “fabricated depiction of an identifiable minor” engaging in a sexually explicit act — a crime punishable by up to a year in jail.

Washington state Sen. Tina Orwall, a Democrat, said that she and her colleagues wanted to act right away because it can be hard to keep up with this kind of technology.

“It [technology] just moves so fast,” she said. “Deepfakes and AI have been around, but now it seems like it’s accelerated. We’re just concerned about how we can protect people from the parts that are harmful.”

Deepfake pornography bills also are advancing in other states, including Illinois, Missouri, New Jersey and Ohio.

A Georgia legislative study committee discussed the potential for a state law, but no such bill advanced during the 2024 legislative session.

“States need to have their own laws that empower local law enforcement to be able to step in and act in these circumstances,” said Illinois Republican state Sen. Dan McConchie, who is sponsoring a bill that would prohibit the creation of deepfakes that feature minors engaged in sexual activity. “We can’t wait for an overtaxed federal judiciary to hopefully get around to it at some point.”

There are no federal laws banning deepfake porn, but several bills have been introduced in Congress, including the AI Labeling Act of 2023 and the DEFIANCE Act of 2024. Neither has moved out of committee.

High school victims

In 2023, sophomore students at Westfield High School in New Jersey allegedly created and spread deepfake porn images of Francesca Mani and other classmates without their consent. In response, Principal Mary Asfendis sent a letter notifying the school community of the incident and inviting students to seek support from the school’s counselors. The school also launched an investigation, Mary Ann McGann, coordinator of school and community relations, wrote in an email to Stateline.

Francesca and her mother, Dorota, have been advocating for legislation that would protect girls in the future, Dorota Mani said in an interview.

Since the Westfield High incident, there have been news reports of middle- and high-school students in California, Florida and Washington state becoming victims of deepfake pornography. The students — primarily girls — were allegedly targeted by their classmates, according to the reports.

The American Legislative Exchange Council, a conservative public policy organization, is promoting model language for state lawmakers to use that would target individual actors rather than technology developers. The Stop Deepfake CSAM Act is intended to supplement laws against child pornography, while the Stop Non-Consensual Distribution of Intimate Deepfake Media Act aims to bolster revenge porn laws.

“Artificial intelligence is a tool that can be used for good or used for ill,” said Jake Morabito, who heads a technology task force at the organization. “What we should be focusing on is harmful conduct use with AI. So, we should go after the bad actors and the harmful conduct, but don’t go after the people who are making the software.”

In Virginia, legislators realized that a revenge porn law enacted in 2014 was not enough to protect people who had been harmed by deepfake porn. As a result, state Del. Marcus Simon, a Democrat, helped pass an amendment in 2019 to include images that were artificially created.

“What duties do we owe to each other as good digital citizens?” Simon asked. “And what are the remedies for violating people? All of that will need to be worked out.”

This story first appeared on Stateline, a Georgia Recorder sibling outlet that is also part of the nonprofit States Newsroom network.

