Someone sends you a message: “You need to see this, I’m sorry,” followed by a link. What pops up is your own face, looking back at you as you engage in acts of hardcore pornography. Of course, it’s not really you. But it is your likeness: an image of you that has been mapped onto a video of someone else using AI technology. This is what’s known as a “deepfake.” It’s happening across the globe—to actors, politicians, YouTubers and regular women—and in most countries, it’s entirely legal.
In 2017, a Reddit user called “deepfakes” made a thread where people could watch fake videos of “Maisie Williams” or “Taylor Swift” having sex. Within eight weeks, before the thread was shut down, it had amassed 90,000 subscribers.
According to cybersecurity firm Sensity, deepfakes are growing exponentially, doubling every six months. Of the 85,000 circulating online, 90 percent depict non-consensual porn featuring women. As for the creators, a quick look at the top 30 on one site reveals deepfakers all over the world, including in the U.S., Canada, Guatemala and India. Of those who list their gender, all except one are male.
Last October, British writer Helen was alerted to a series of deepfakes on a porn site that appeared to show her engaging in extreme acts of sexual violence. That night, the images replayed themselves over and over in horrific nightmares, and she was gripped by an all-consuming feeling of dread. “It’s like you’re in a tunnel, going further and further into this enclosed space, where there’s no light,” she tells Vogue. This feeling pervaded Helen’s life. Whenever she left the house, she felt exposed. On runs, she experienced panic attacks. Helen still has no idea who did this to her.
These videos may be fake, but their emotional impact is real. Victims are left with multiple unknowns: Who made them? Who has seen them? How can they be contained? Because once something is online, it can reappear at any moment.
The silencing effect
Amnesty International has been investigating the effects of abuse against women on Twitter, specifically how it shapes their behavior online afterward. According to the charity, abuse creates what it calls “the silencing effect,” whereby women feel discouraged from participating online. The same can be said for victims of deepfakes.