Deepfake Watch 23

Female Politicians Face The Nudify Menace | Deepfake Watch

July 05, 2024

The UK elections may have concluded peacefully, but they revealed a concerning trend. Several leading female politicians became victims of deepfake pornography in the run-up to the polls, despite the UK having laws against the creation and sharing of deepfake porn.

With many other countries yet to choose their leaders this year, female politicians are on the front lines of the threat of deepfake porn, with their political careers potentially at stake.

Opt in to receive the newsletter every Friday.

British female politicians bear the brunt of obscure nudify apps

Channel 4 News discovered a website dedicated to deepfake porn which included “400 digitally altered pictures of more than 30 high-profile UK politicians.”

The victims include leading politicians like Labour’s Deputy Leader Angela Rayner, Education Secretary Gillian Keegan, Conservative Commons Leader Penny Mordaunt, former home secretary Priti Patel, and Labour MP Stella Creasy.

Last year, the UK passed the Online Safety Act, which criminalised the sharing of sexually explicit deepfakes. Earlier this year, the Ministry of Justice announced a further law that outlaws the creation of deepfakes, regardless of any intention to share them. However, there is a loophole: prosecutors must prove that the accused intended to cause distress by creating the deepfake. And while the Online Safety Act is in effect, websites hosting such non-consensual deepfakes continue to thrive.

Clare McGlynn, Professor of Law at Durham University, believes that such deepfakes “are being used to try to silence women politicians, to scare them from public office and speaking out.”

She also highlights the inaction of popular search engines, which she accuses of enabling the proliferation of such content. “Google and Bing are facilitating deepfake sexual abuse on an exponential scale. When are they going to down-rank or de-list these websites and nudify apps?” she asks in a LinkedIn post.

Earlier this year, US politician Alexandria Ocasio-Cortez spoke out about the trauma of seeing a deepfake of herself, and has led the DEFIANCE Act, which lets victims sue distributors of deepfake pornography depicting them, in the House of Representatives.

Italian Prime Minister Giorgia Meloni is currently seeking US$100,000 in damages over sexually explicit deepfakes of her being shared on the internet. Several media outlets reported that the police are investigating a 40-year-old man, along with his 73-year-old father, for creating the deepfakes of Meloni.

YouTube updates policy on deepfake removal requests

If you spot an AI-generated likeness of yourself on YouTube made without your consent, you can now request the removal of that content.

As part of its latest policy change addressing AI concerns, YouTube will now allow users to request the removal of non-consensual deepfakes of themselves from the platform.

“If someone has used AI to alter or create synthetic content that looks or sounds like you, you can ask for it to be removed. In order to qualify for removal, the content should depict a realistic altered or synthetic version of your likeness,” the policy reads.

Indian celebrities fall victim to deepfakes promoting betting apps

The internet is rife with deepfakes of Shah Rukh Khan, Virender Sehwag and Sonu Nigam, with videos showing them promoting illegal betting platforms.

Hera Rizwan, reporting for Decode, spoke to people who were lured into staking significant amounts of money on cricket matches through obscure Telegram channels. The betting platforms were all bogus, but managed to draw people in through deepfake ads. These ads are part of a larger trend in which "match predictors" or "tippers" aggressively promote betting on cricket matches.

Loopholes in social media platforms' policies and the anonymity provided by platforms like Telegram contribute to the proliferation of such activities.

Have you been a victim of AI?

Have you been scammed by AI-generated videos or audio clips? Did you spot AI-generated nudes of yourself on the internet?

Decode is trying to document cases of abuse of AI, and would like to hear from you. If you are willing to share your experience, do reach out to us at [email protected]. Your privacy is important to us, and we shall preserve your anonymity.

About Decode and Deepfake Watch

Deepfake Watch is an initiative by Decode, dedicated to keeping you abreast of the latest developments in AI and its potential for misuse. Our goal is to foster an informed community capable of challenging digital deceptions and advocating for a transparent digital environment.

↗️ Was this forwarded to you?  Subscribe Now 

We invite you to join the conversation, share your experiences, and contribute to the collective effort to maintain the integrity of our digital landscape. Together, we can build a future where technology amplifies truth, not obscures it.

For inquiries, feedback, or contributions, reach out to us at [email protected]. 

🖤 Liked what you read? Give us a shoutout! 📢

↪️ Become A BOOM Member. Support Us!
↪️ Stop.Verify.Share - Use Our Tipline: 7700906588

↪️ Follow Our WhatsApp Channel
↪️ Join Our Community of TruthSeekers

Unsubscribe