A popular gamer on Twitch has tearfully revealed she is the latest high-profile victim of deepfake porn, with predators pasting her face onto a pre-existing adult video to make it appear she had actually starred in the kinky clip.
QTCinderella, a 28-year-old American whose real name is Blaire, went live on the streaming site last week to lash out at the cyber sickos who made the video, as well as a prominent male Twitch star who had admitted to buying deepfake porn.
“I’m so exhausted and I think you guys need to know what pain looks like because this is it,” the gamer wept. “This is what it looks like to feel violated. This is what it feels like to be taken advantage of, this is what it looks like to see yourself naked against your will being spread all over the internet. This is what it looks like.”
She then took aim at gamer Atrioc, who had earlier told fans that he purchased two doctored clips featuring other famous female Twitch stars, prompting a spike in traffic to a deepfake porn site.
“F–k the f–king internet. F–k Atrioc for showing it to thousands of people. F–k the people DMing me pictures of myself from that website. F–k you all! This is what it looks like, this is what the pain looks like,” QTCinderella continued during her emotional livestream.
“To the person that made that website, I’m going to f–king sue you,” she vowed. “I promise you, with every part of my soul I’m going to f–king sue you.”
The Post has reached out to QTCinderella for further comment.
Given the rapid advancement of technology, it’s difficult to distinguish deepfake porn from legitimately filmed videos, adding to the distress experienced by victims who insist they weren’t party to the production. Laws have also not kept up with current online activity, meaning it may prove difficult for QTCinderella to sue the person who created the disturbing, doctored video.
Tech writer River Page first reported on the Twitch star’s deepfake horror. In his essay, republished by The Free Press, Page explained that “there is a federal revenge porn law that allows victims of nonconsensual porn to file lawsuits against perpetrators, but the law doesn’t address deepfakes specifically.”
“A federal law should be in place,” Page further wrote. “Will it stop deepfake porn? Not completely. Federal law hasn’t eliminated the production and distribution of child pornography either, but the enforcement of those laws has driven the practice to the extreme margins, and has attached a heavy cost to participating in the trade.”
At present, cyber pervs use software involving machine learning or artificial intelligence to create deepfakes with relative ease — and with little fear of prosecution.
Celebs such as Scarlett Johansson and Emma Watson have been the victims of deepfake porn videos, and some sleazebags are charging just $20 to create fabricated videos of exes, co-workers, friends, enemies and classmates.
Robert Chesney from the University of Texas and Danielle Citron from the University of Maryland have said the damage from being a victim of deepfake porn can be “profound.”
“Victims may feel humiliated and scared,” they wrote in a 2019 research paper. “When victims discover that they have been used in fake sex videos, the psychological damage may be profound — whether or not this was the aim of the creator of the video.”
The damage appears obvious in QTCinderella’s case, but some on social media have expressed little sympathy.
“I don’t get the big deal at all. Like, there could be terabytes of photoshopped porn of me, and I wouldn’t care… because I’m not actually experiencing those scenarios being depicted. It’s literally not real,” one Twitter user wrote.
“I get being upset. But if you’re crying over people putting your face on a pornstar then maybe you’re too soft for the internet. It’s wild out here,” another declared.