Poll shows huge majority of Americans favor criminalizing the sharing of fake AI-generated sexually explicit imagery

 February 7, 2024

A massive public outcry ensued in recent weeks after obviously fake, artificial intelligence-generated sexually explicit images of pop sensation Taylor Swift went viral online and circulated across social media.

Now, a recent poll has shown that approximately 75% of Americans favor the imposition of criminal charges against individuals who create and share so-called "deepfake" and nonconsensual pornography, the Daily Mail reported.

The poll, which coincides with multiple pieces of recently introduced bipartisan legislation to address the issue, found solid majorities in virtually every demographic category in favor of criminalizing such false and explicit imagery.

Public overwhelmingly supports criminalization

The Daily Mail joined forces this month with the pollsters at TIPP and asked 1,400 survey respondents whether they agreed or disagreed that "People who share deepfake porn online, like the explicit images of Taylor Swift, should face criminal charges."

Overall, 75% agreed with the criminalization of deepfake, AI-generated pornography, while 14% disagreed and 11% were unsure either way.

Support was highest among older Americans but still strong among the younger crowd -- 84% for those age 65+ compared to around 66% for those age 18-24 -- and was just a bit higher among Democrats than Republicans, with 81% favoring criminalization among the former and 71% among the latter.

Not a new problem, despite recent surge of attention

Interestingly enough, the Daily Mail reported that AI-generated deepfake porn involving celebrities is not a particularly new problem -- such false and nonconsensual imagery has circulated online for years -- though the issue certainly received a substantial boost in attention in recent weeks after the fake Taylor Swift images went viral on social media.

Those particular images of Swift, which featured her in various sexually explicit positions while dressed in Kansas City Chiefs garb, were traced back to anonymous forums on 4chan, where countless other similarly fake images of dozens of other celebrities can be found and have been posted for years.

It was likely the migration in January of those Swift images from relatively obscure forums to major social media platforms, combined with significant improvements in the quality and realism of AI-generated imagery, that caught the public's attention and spurred the incredible outcry.

Congress already trying to criminalize AI-generated fake sexual imagery

Lawmakers in Congress, ever attuned to public opinion, moved quickly to seize on the moment and introduce legislation. That includes a bipartisan bill in the Senate Judiciary Committee dubbed the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or DEFIANCE Act, which would create a federal avenue for victims to sue entities or individuals responsible for creating and spreading a "digital forgery."

"Sexually explicit 'deepfake' content is often used to exploit and harass women -- particularly public figures, politicians, and celebrities," Chairman Dick Durbin (D-IL), a sponsor of the bill, said in a statement. "This month, fake, sexually explicit images of Taylor Swift that were generated by artificial intelligence swept across social media platforms."

"Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit 'deepfakes' is very real. Victims have lost their jobs, and they may suffer ongoing depression or anxiety," he added. "By introducing this legislation, we're giving power back to the victims, cracking down on the distribution of 'deepfake' images, and holding those responsible for the images accountable."

Joining him as a bill sponsor was Sen. Josh Hawley (R-MO), who said, "Nobody -- neither celebrities nor ordinary Americans -- should ever have to find themselves featured in AI pornography," and added, "Innocent people have a right to defend their reputations and hold perpetrators accountable in court. This bill will make that a reality."

Another bipartisan bill, the Preventing Deepfakes of Intimate Images Act, which would make the creation and sharing of fake nonconsensual sexual imagery a federal criminal offense, was introduced last month by Reps. Joe Morelle (D-NY) and Tom Kean (R-NJ). Rather than jumping on the Swift bandwagon, the two highlighted the real case of teenage high school girls in New Jersey who had been victimized by fake AI-generated explicit imagery circulated among their classmates.

© 2024 - Patriot News Alerts