‘The law is really slow in catching up’: Woman fights for justice after friend made deepfake porn of her

Exclusive

By Sofia Luis-Hobbs and Katy Ronkin

A woman whose best friend created non-consensual deepfake pornography of her is campaigning for legal reform.

Jodie* received an anonymous email in March 2021. When she clicked a link in the message, she found images and videos that appeared to show her having sex with different men.

They were deepfakes.

These are digitally altered images and videos made using artificial intelligence. In Jodie's case, her likeness had been used to create pornographic content.

What she didn't know at the time was that one of her best friends was responsible.

Laura Farris: 'It has catastrophic psychological consequences for women, so we're making it a crime today'

"There were probably about ten [deep fakes] posted by a few different users. There were a couple of videos, and the rest were photos.

"There was a school-girl photo, actually there was a few. I was being anally penetrated in one of them ... it was just really horrifying".

Since then, Jodie has campaigned for a change in the law.

On 13th December 2024, the House of Lords will hear the second reading of a private member's bill that would create offences for the creation and solicitation of non-consensual sexually explicit images or videos.

This proposed amendment to the Sexual Offences Act 2003 has been brought to the Lords by Baroness Owen of Alderley Edge after a long campaign by activists.

Baroness Charlotte Owen. Picture: Alamy

"Thousands of these images are being made every day, causing havoc and harm to mostly women and girls," said Clare McGlynn, a Professor of Law at Durham University.

"Action should've been taken a number of years ago because we've known about sexually explicit deepfakes since 2017. The law is really slow in catching up here."

Jodie told LBC that the concerning online activity started before she was alerted to the deepfakes. She said they were "the tip of an iceberg."

Three months earlier, she had received her first anonymous email, containing a link to a forum where images of women, including her and her friends, had been posted with requests for other users to comment on their bodies.

After searching online, she had also come across fake social media profiles soliciting sex.

Jodie hadn't set up these profiles, but they had real photos of her, her real name, and her location.

Getting nowhere with her attempts to have the content removed at the time, Jodie chose to ignore it.

She was about to start a new job and wanted to put the situation behind her.

But in March, after she saw the digitally altered deepfake images of what looked to be her having sex with men, she decided to act.

"Thousands of these images are being made every day, causing havoc and harm to mostly women and girls," said Clare McGlynn, a Professor of Law at Durham University.Picture: LBC

After a three-hour meeting with the City of London Police, Jodie was turned away and told that no crime had been committed.

She then approached the Metropolitan Police, who took on the case.

"I wasn't going to put up with being dismissed again because I knew that it was wrong."

By this point, Jodie knew who her perpetrator was.

He'd solicited another deepfake to be made. This time, it was from a photo they'd taken on a day out together that Jodie had only shared with him.

Finally knowing who the perpetrator was, Jodie said she was able to explain the situation to two female police officers.

She said: "I felt that I was taken seriously, and I felt they were going to do something about it.

"But the reality of dealing with the Met Police after that was very different and there was a complete lack of support.

She told LBC that she was not assigned a liaison officer and was not given a phone number to get an update on the case.

There were "numerous emails of me just trying to get updates because weeks would go past and I'd have nothing….and he was still posting [explicit deepfakes] during this time.

Jodie also told us that she had to compile the evidence herself, trawling back through the traumatising content and putting it together in a file for the police.

Professor McGlynn said: "It shows the police were ill-equipped.

"[Jodie] had to gather the evidence herself to present to the police and she had to keep insisting that action was taken.

"We can't expect every survivor to have to do that. It's the police's job to investigate."

Detective Superintendent Vicky Tunstall from the Metropolitan Police said: "We are dedicated to protecting women and girls from predatory offenders and are transforming how we protect the public from harm.

"We would urge anyone who is unhappy with how their case is handled to get in touch so we can address any dissatisfaction, and continue to improve our service.

Detective Chief Superintendent Mandy Horsburgh, Head of Specialist Operations at City of London Police, said: "We apologise for any distress to the victim in our initial assessment and response.

"We will take every opportunity to learn from this."

Jodie's friend was convicted – but not for the deepfake images.

He was convicted of sending messages that were grossly offensive or of an indecent, obscene or menacing nature over a public electronic communications network.

He was given a 20-week sentence suspended for 24 months.

The bill would make not only the creation but also the solicitation of non-consensual deepfake porn illegal, meaning perpetrators like Jodie's friend could be convicted if the amendments are brought in.

The Revenge Porn Helpline said it received a low number of reports of deepfake image abuse in 2023, but acknowledged that "a lack of awareness and under-reporting could play a part" in this.

End Violence Against Women Coalition director Andrea Simon told LBC: "It's been called an invisible threat because many victims may be completely unaware that they've been targeted at all.

"Sometimes when they do become aware that they've been targeted for deepfake sexual abuse they don't even know the extent of how far those images gave been shared or where they've been hosted across the internet.

Still, the 46 complaints the Revenge Porn Helpline received last year represented a 119% increase on the year before, which they say indicates a "concerning increasing trend in synthetic content."

A spokesperson for the National Police Chiefs' Council said: "Creating and sharing sexually explicit images of someone without their consent, including deepfakes, is deeply pervasive and can have a traumatic impact on victims.

"It's important that victims understand that these are serious crimes and feel confident in reporting offences to the police so that action can be taken.

Jodie told LBC her campaign work goes beyond her experience: "This is about women everywhere who have been silenced, violated and exploited online and offline.

"This is about standing up for justice and protecting women's rights. If you'd like to support, please consider signing the petition."

*Name changed to protect identity.