Tackling Deepfake Pornography: How US States Are Pioneering New Laws

As national legislation concerning deepfake pornography slowly progresses through Congress, states throughout the US are taking action on their own. Legislators in 39 states have introduced a mix of bills aimed at deterring the creation of nonconsensual deepfakes and punishing those who make and distribute them.

Earlier this year, Democratic congresswoman Alexandria Ocasio-Cortez, who has herself been a target of nonconsensual deepfakes, introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act, known as the Defiance Act. If enacted, the legislation would allow people targeted by deepfake pornography to sue, provided they can show the content was made without their consent. In June, Republican senator Ted Cruz introduced the Take It Down Act, which would compel platforms to remove both revenge porn and nonconsensual deepfake pornography.

While many of these initiatives enjoy bipartisan support, federal bills typically take a long time to pass both chambers of Congress and become law. State and local governments, however, can move more quickly, and many are doing just that.

To date, legislators in 39 states have introduced bills targeting nonconsensual deepfakes: 23 states have passed them into law, four have bills still pending, and nine have rejected such proposals.

Last month, San Francisco City Attorney David Chiu’s office announced a lawsuit against 16 top websites that enable the creation of AI-generated pornography. “Generative AI holds great potential; however, like all technologies, it also comes with pitfalls and those eager to misuse it. It is essential to recognize that this isn’t innovation—this is a form of sexual abuse,” Chiu remarked in his office’s statement.

The lawsuit is among several efforts to address the rising problem of nonconsensual deepfake pornography.

“Many think only celebrities are impacted by this issue,” says Ilana Beller, organizing manager at Public Citizen, which tracks nonconsensual deepfake legislation and shared its data with WIRED. “However, many ordinary individuals also suffer from these violations.”

According to data from Public Citizen, 23 states have enacted legislation against nonconsensual deepfakes. “This widespread problem is now acknowledged by state lawmakers as needing urgent regulation,” Beller notes. “There’s also a surge in interest among legislators to craft AI policies, given the rapid advancement of this technology.”

Last year, WIRED reported that deepfake pornography is only increasing, and researchers estimate that 90 percent of deepfake videos are of porn, the vast majority of which is nonconsensual porn of women. But despite how pervasive the issue is, Kaylee Williams, a researcher at Columbia University who has been tracking nonconsensual deepfake legislation, says she has seen legislators more focused on political deepfakes.

“More states are interested in protecting electoral integrity in that way than they are in dealing with the intimate image question,” she says.

Matthew Bierlein, a Republican state representative in Michigan, who cosponsored the state’s package of nonconsensual deepfake bills, says that he initially came to the issue after exploring legislation on political deepfakes. “Our plan was to make [political deepfakes] a campaign finance violation if you didn’t put disclaimers on them to notify the public.” Through his work on political deepfakes, Bierlein says, he began working with Democratic representative Penelope Tsernoglou, who helped spearhead the nonconsensual deepfake bills.

At the time, in January, nonconsensual deepfakes of Taylor Swift had just gone viral, and the subject was widely covered in the news. “We thought that the opportunity was the right time to be able to do something,” Bierlein says. He also felt Michigan was in a position to be a regional leader in the Midwest because, unlike some of its neighbors, it has a full-time legislature with well-paid staffers (most states don’t). “We understand that it’s a bigger issue than just a Michigan issue. But a lot of things can start at the state level,” he says. “If we get this done, then maybe Ohio adopts this in their legislative session, maybe Indiana adopts something similar, or Illinois, and that can make enforcement easier.”

Penalties for creating and sharing nonconsensual deepfakes vary significantly from state to state. “The US landscape is just wildly inconsistent on this issue,” says Williams. “There seems to be a misconception that numerous laws are being enacted nationwide; however, what’s actually happening is a multitude of laws being proposed.”

In some states, people can pursue both civil and criminal actions against those who create and distribute deepfakes; in others, legal remedies are more limited. Recent legislation in Mississippi, for example, specifically targets deepfakes involving minors. In recent months, there have been numerous cases of middle and high school students using AI to create explicit imagery of their peers, most often targeting female students. Other states have focused on adults, amending existing statutes that address revenge pornography.

While there is widespread agreement on the immoral nature of nonconsensual deepfakes involving minors, views on similar abuses involving adults are more divided. Often, legislation around adult deepfakes requires proving malicious intent on the part of the creator, which complicates legal proceedings.

The anonymity of online spaces makes these laws difficult to enforce, notes Sara Jodka, a privacy and cybersecurity expert. “If you cannot determine the identity of someone behind an IP address, proving their identity and intent becomes nearly impossible,” she says.

Williams says that creators of nonconsensual deepfakes often don’t see their actions as harmful, rationalizing the images as expressions of admiration and attraction. “This is fan content,” she says they tell themselves.

Jodka argues that while state laws are a good start, their reach is limited: only federal legislation, she says, would enable the kind of interstate investigations and prosecutions needed to tackle nonconsensual deepfakes effectively. “States don’t really have a lot of ability to track down across state lines internationally,” she observes, which means enforcement under these laws is likely to remain rare and narrowly applied.

Bierlein, however, says many state representatives are not willing to wait for the federal government to act. He is particularly concerned about the role nonconsensual deepfakes can play in sextortion scams, which the FBI says are on the rise. In 2023, a Michigan teen took his own life after scammers threatened to release his intimate photos. “Things move really slow on a federal level, and if we waited for them to do something, we could be waiting a lot longer,” he says.
