Deepfaked: ‘They put my face on a p0rn video’
[BBC] In the past, high-profile celebrities and politicians were the most common targets of deepfakes. The videos weren't always porn; some were made for comedic value.
[Biden stumbling about on the stage?]
But over the years that's changed - according to cybersecurity company Deeptrace, 96% of all deepfakes are non-consensual porn.

Like revenge porn, deepfake pornography is what's known as image-based sexual abuse - an umbrella term which encompasses the taking, making and/or sharing of intimate images without consent.

It is already an offence in Scotland to share images or videos that show another person in an intimate situation without their consent. But in other parts of the UK, it's only an offence if it can be proved that such actions were intended to cause the victim distress - a loophole which means video creators often don't face legal consequences.

Government plans for a long-awaited UK-wide Online Safety Bill have been under endless revision and repeatedly shelved. The new laws would give the regulator, Ofcom, the power to take action against any website deemed to be enabling harm to UK users, no matter where they are based in the world. Earlier this month, however, Culture Secretary Michelle Donelan said she and her team were now "working flat out" to ensure the bill was delivered.

Kate, 30, founded the #NotYourPorn campaign in 2019. A year later, her activism contributed to the adult entertainment website Pornhub having to take down all videos uploaded to the site by unverified users - the majority of its content.

Kate therefore assumed that whoever was behind the deepfake of her had been annoyed by her campaigning. She had "taken away their porn".

But she had no idea who that person was, or who might have seen the video. And while she could see that her face had been overlaid onto footage of a porn actor, the video was convincing enough that she worried others might not spot the deception.

"It was a violation - my identity was used in a way I didn't consent to."

Underneath the video, people began leaving streams of abusive comments, saying they were going to follow Kate home, rape her, film the attack, and publish the footage on the internet.

"You start thinking about your family," she says, holding back tears. "How would they feel if they saw this content?"

The threat intensified when both Kate's home and work addresses were published below the video - a practice known as doxing.

A colleague reported the video, vicious comments and doxing to Twitter, and they were all taken down from the platform. But once any deepfake has been published and shared online it's difficult to remove it from circulation entirely.

"I just wanted that video off the internet," Kate says, "but there was nothing I could do about it."

There's a marketplace for deepfakes in online forums. People post requests for videos to be made of their wives, neighbours and co-workers and - unfathomable as it might seem - even their mothers, daughters and cousins.

Content creators respond with step-by-step instructions - what source material they'll need, advice on which filming angles work best, and price tags for the work.

The standard of deepfakes can vary wildly, and depends both on the expertise of the person who made the video and the sophistication of the technology used.

But the man behind the largest deepfake porn website admits it's no longer easy to know for certain whether you're looking at manipulated images or not. His site attracts about 13 million visitors a month and hosts roughly 20,000 videos at any one time. He is based in the US and rarely speaks to the media - but he agreed to talk to the BBC anonymously.

Deepfaking "ordinary" women is a red line for him, he says, but in his view, hosting pornographic deepfake videos of celebrities, social media influencers and politicians, is justifiable.

"They're accustomed to negative media, their content is available in the mainstream. They're different to normal citizens," he says.

"The way I see it, they are able to deal with it in a different way - they can just brush it off. I don't really feel consent is required - it's a fantasy, it's not real."

Does he think what he's doing is wrong? Part of him is "in denial about the impact on women", he admits - and notably, he reveals that his spouse doesn't know what he does for a living.

"I haven't told my wife. I'm afraid of how it might affect her."

Until relatively recently, deepfake software wasn't easily available, and the average person wouldn't have had the skills to make them. But now, anyone over the age of 12 can legally download dozens of apps and make convincing deepfakes in a few clicks.

For Kate that's worrying and "really scary".

"It's not the dark web, it's in the app stores - right in front of our faces."

She also fears the hoped-for Online Safety Bill won't keep up with technology. Three years ago, when the bill was first drafted, deepfake creation was seen as a professional skill that required training - not something achievable by merely downloading an app.

"We're years down the line and the contents of [the bill] are out of date - there's so much missing," she says.

But for one deepfake creator, Gorkem, criminalising deepfaking would change things.

"If I could be traced online I would stop there and probably find another hobby," he says.

Posted by: Skidmark 2022-10-22
http://www.rantburg.com/poparticle.php?ID=647382