Do you remember your first encounter with deepfake videos? This technology, which uses increasingly accessible Artificial Intelligence platforms to superimpose anyone’s face onto someone else’s body, became very popular years ago.
When it first gained its unfortunate fame, most people discovered this technology through its application to explicit adult video content, with results convincing enough to make viewers doubt, at least for a moment, whether what they were seeing was real.
Given its origins, most of these videos featured the faces of celebrities, almost all of them actresses, inserted into all kinds of sexual scenes. This caused quite a stir on the internet, especially as the realism of each fake video grew more refined.
The issue then escalated further when users on platforms like TikTok began using this technology to create fake look-alikes of other celebrities, now shown in more mundane situations.
The technique even went mainstream, with shows like The Mandalorian using it to digitally rejuvenate Luke Skywalker.
It seemed that the murky origins of deepfake videos had been left behind. But a worrying trend is now emerging: fake pornographic videos starring not celebrities, but anyone who has ever posed in front of a camera.
This is the story of Kate Isaacs and the deepfake videos starring anyone
The BBC has just published an extensive and disturbing article telling the story of Kate Isaacs, a British activist who has spent years fighting against pornographic videos shared without consent.
As a result of this fight, she recently became the target of a harassment campaign in which deepfake videos with explicit sexual content, starring her, began to circulate:
“I panicked. Someone had taken my face, put it into a porn video, and made it look like it was me. My heart sank.
I couldn’t think clearly. I remember thinking that video would go everywhere. It was horrible. It was a violation. They had used my identity in a way I had not consented to.”
As the BBC itself describes, deepfake pornography, like “revenge porn”, is considered a new kind of image-based sexual abuse.
This term now encompasses the non-consensual taking, creating, and/or sharing of intimate images, including those generated by artificial intelligence.
The same article interviews people who specialize in creating this kind of material on request. They explain that a few photographs lifted from social media, or even frontal headshot captures from Zoom video calls, are enough to produce a convincing deepfake video.
This is clearly a phenomenon that no longer affects only celebrities.