Written by Hannah Ellman, CCASA Policy Intern
You’ve probably heard the word “deepfake” in the news in connection with big names like Taylor Swift, Scarlett Johansson, or Selena Gomez, who have been targeted with nonconsensual sexual images online. While these stories grab headlines, this kind of abuse reaches far beyond the world of celebrities.
As artificial intelligence becomes easier to access and more integrated into our daily lives, there’s been a sharp rise in the number of nonconsensual, sexually explicit images online that are digitally created or manipulated. This is one form of image-based sexual abuse (IBSA), and it’s causing real harm.
Now, as IBSA becomes more widespread, it is more important than ever that we understand what it is, how it impacts survivors, and what protections exist to help stop it.
What is IBSA?
Think of IBSA as an umbrella term for a range of harmful experiences that involve the creation, manipulation, collection, or sharing of sexually explicit materials without the consent of the person depicted.
Here’s a quick overview of some common forms of IBSA, along with some popular media terms you might recognize:*
- Nonconsensual distribution of sexually explicit material (often called “revenge porn”): sharing sexually explicit or sexualized images or videos—whether real, AI-generated, or digitally altered—without the consent of the person depicted.
- Sexual extortion (or “sextortion”): using real or digitally altered sexual images or videos to blackmail, coerce, or threaten someone into doing something, such as sexual acts, sending explicit images, or paying money.
- Video voyeurism (or “spycamming”; includes “down-blousing” and “upskirting”): secretly recording images or videos of people in public or private spaces without their knowledge or consent.
- Digitally altered/manipulated sexual material (often called “deepfake” or “cheap fake” pornography): creating sexual content by altering an existing image or video using AI or other tools, like placing someone’s face or body into a pornographic scene or using a “nudify” app to remove clothing from an ordinary photo (a growing issue among U.S. youth).
- AI-generated sexual material: creating new sexual content using artificial intelligence, often using models trained on real people’s images.
*This is not a comprehensive list. For a deeper dive, we recommend the National Center on Sexual Exploitation’s helpful resource, “Identifying Image-Based Sexual Abuse: Classifications and Definitions.”
A note on the language we use
One of the most widely recognized terms you’ll see in reference to IBSA is “revenge porn.” While the phrase might grab attention, it is also misleading and harmful: it implies that nonconsensual material counts as pornography and that survivors somehow did something to deserve “revenge.” This framing invites victim-blaming and minimizes the abuse.
Similarly, recent legislation has relied on phrases like “nonconsensual intimate imagery (NCII)” and “intimate digital depictions” in reference to IBSA. While we strongly support legal efforts to expand protections, we do not use the word “intimate” in connection with this kind of abuse. There is nothing intimate about IBSA.
So how prevalent is IBSA, really? And why does this matter?
The first-ever nationwide study from the Cyber Civil Rights Initiative (2017) found that 1 in 8 U.S. adults had either been threatened with the nonconsensual sharing of a sexual photo or had one shared without their consent. Within those statistics, women are 1.7 times as likely to be targeted, with even higher rates for LGBTQIA+ and gender-expansive people.
But the harm doesn’t stop once a threat is made or an image is shared. Survivors often experience ongoing harassment, loss of trust, and disruptions to school, work, and daily life. Further, the psychological impacts can be devastating, including increased rates of depression, anxiety, feelings of isolation, self-harm, and risk of suicide. For many survivors, especially those from marginalized communities, this trauma is often compounded by barriers to justice.
What you can do:
Now for the good news: new state and federal legislation this year has expanded protections for survivors, updating definitions of IBSA, and the pathways to justice that follow from them, to include nonconsensual AI-generated and digitally altered sexual images. These changes send an important message: when it comes to nonconsensual sexual material, fake is not harmless. They have also helped raise awareness and increase the visibility of resources available to support survivors.
If you or someone you know has experienced image-based sexual abuse, know that help is available. We invite you to explore the resources below to learn more.
To survivors of image-based sexual abuse: CCASA stands with you. You are not alone. You deserve safety, dignity, and justice.
Your body. Your image. Your choice. Always.
Learn more:
- National Center on Sexual Exploitation – Image-Based Sexual Abuse: learn more about image-based sexual abuse and the forms that IBSA can take.
- Cyber Civil Rights Initiative (CCRI) Safety Center: step-by-step guide for what to consider if you or someone you know is impacted by IBSA.
- From the Cyber Civil Rights Initiative (CCRI): “Empowering the Bystander: What to Do if You’ve Witnessed Image-Based Sexual Abuse”: a resource specifically for individuals who come across incidents of IBSA.
The views, thoughts, and opinions expressed in this blog belong solely to the individual authors and do not necessarily reflect the official policy or position of the Colorado Coalition Against Sexual Assault (CCASA). The content is provided for informational and educational purposes only and should not be construed as legal, medical, or professional advice. CCASA appreciates the diverse voices that contribute to this platform and encourages thoughtful dialogue, but cannot guarantee the completeness or accuracy of all shared content.