Nonconsensual Pornography: How (In)Action Informs Culture

While it's nothing new to see Taylor Swift in the headlines, the headlines about her this past week have been of a very different kind.

Last Thursday, AI-generated pornographic deepfakes — images that are algorithmically created or altered to look extremely realistic — depicting Swift began circulating on X, the platform formerly known as Twitter. Swift did not create, or consent to the release and wide circulation of, these explicit images. While some may identify this type of attack as "revenge porn", the accurate term for this type of violating and violent online attack is "nonconsensual pornography."

Sidenote: The term “revenge porn” should be made obsolete. Not only does it imply that perhaps the person being targeted did something to ‘deserve’ such an attack, but the phrase is also a misnomer because offenders are often motivated by something other than revenge, such as attention, extortion/financial gain, or simply malice.

It's no accident that these nonconsensual images are sexual in nature. Sexual violence is still the violence of choice for attacking, exploiting, and harassing women. In 2023, a study by research firm Home Security Heroes found that 99% of deepfake targets are women, and that deepfake porn makes up 98% of all deepfake videos online. While public figures, politicians, and celebrities bear the brunt of these online attacks, the risk of harm to children, young adults, and private individuals remains high as well.

This proliferating threat must be addressed quickly, before it gets out of hand. With the advent of generative AI, the potential for harm — particularly to women and marginalized groups, who are already the primary targets of most technology-facilitated violence — is only growing.

  • Platform Accountability: Meta, X, Discord, Snap, TikTok, etc. have to date shown little interest in addressing online violence and pornography on their platforms. However, much as they are now being held to account over child sexual abuse material amidst a staggering increase in reports of online child enticement, they must likewise be compelled to eliminate nonconsensual pornography and other forms of technology-facilitated gender-based violence on their platforms. The platforms must be held accountable both for what their algorithms promote and proliferate and for enforcing their own community rules and guidelines.

  • Legislative Action: This week, US lawmakers proposed letting people sue over faked pornographic images of themselves. The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would create a civil right of action over intimate "digital forgeries" depicting an identifiable person without their consent, allowing victims to sue anyone who "knowingly produced or possessed" the image with the intent to spread it. Legislation of this kind is certainly a step in the right direction, deterring attacks by imposing a cost on perpetrators.

  • Individual Action: This HuffPost piece has a 5-point list of things to do and consider, including image sourcing, attack documentation, and retaining legal counsel.

Taylor Swift’s impact has been catalytic in the past. Her 2023 Eras Tour singlehandedly boosted the US economy by nearly $6 billion, a single Instagram post led to 35,000 new voter registrations, and her effort to regain control of her music catalog has been inspirationally successful. One can only hope that the Taylor Swift Touch will be equally catalytic in spurring action to make a safer world, online and offline.

genEquality