Deepfake Arnold Wants You Hard

When you think of negative uses of AI, deepfakes will likely be at the top of the list. From unleashing a plethora of misogynistic digital sex crimes to duping people out of billions of dollars through advanced cyberattacks, deepfakes have provided power-ups to digital criminals of all sorts.

Sometimes the results are hilarious, and other times not so much.

Sometimes It Wants You “Rock Hard”

“Your wife or girlfriend will thank you for watching this video. Go to your kitchen right now, grab a pinch of salt and try this simple fifteen second trick discovered by Harvard scientists. This easy trick will make your buddy rock hard without leaving home,” says actor-turned-politician Arnold Schwarzenegger in a sponsored video on YouTube. Has the Terminator been assigned to terminate erectile dysfunction now?

Nope. This is one of those snake oil ads, powered by deepfakes.

According to a recent investigative report by 404 Media’s Emanuel Maiberg, YouTube's advertising space has turned into an unauthorized celebrity showcase, where AI-generated versions of Hollywood's finest have been hawking erectile dysfunction supplements with surprising persistence.

These digital clones of Arnie, along with fellow ‘macho’ celebrities like Sylvester Stallone, Mike Tyson, and Terry Crews, have been appearing in roughly 300 different ad variations, all pushing the same gobbledygook narrative.

Each celebrity clone delivers an identical script about a revolutionary "salt trick" that supposedly keeps them "rock hard for hours," with the AI Terminator cheekily asking viewers if they "really thought adult actors last that long without a little hack."

404 Media dived deeper into the rabbit hole and followed the ‘salt trail’ all the way to an obscure website called “thrivewithcuriousity.com.” There, they were treated to a 40-minute presentation that plays out like an epic B-movie plot on steroids: a journey spanning from Texas strip clubs to a Harvard urologist’s office, and all the way to abandoned Thai churches where mysterious bats supposedly stay aroused all day long.

After cameos from deepfaked Tom Hanks, Denzel Washington and Johnny Sins, they finally arrived at the product being sold - Prolong Power, at $49 per bottle. The label indicates that it is composed of “oat bran powder, fennel seed, cascara sagrada bark powder, and other ingredients”, which 404 Media found are commonly used to treat constipation (according to the National Library of Medicine). The article also noted that the label was missing the key ingredient mentioned in the ads - the “midnight beetle powder” that supposedly kept the church bats horny.

The company’s website features several “verified” customer testimonials alongside profile pictures that deepfake detection company Reality Defender identified as 99% likely to be AI-generated.

While Google has now suspended these particular ads and permanently banned the advertiser, this isn't their first encounter with celebrity deepfake advertising - they had to purge about 1,000 similar scam ads just this January. Despite Google's claims of "constantly working to enhance enforcement systems," these digital imposters keep finding creative ways to slip through the cracks.

Remember, folks: if an AI Arnold is promising you supernatural erections, it's probably time to terminate that browser tab.

AI Necromancy

Every November, households all across Mexico put up pictures of their deceased loved ones on a home altar to celebrate the Day of the Dead. This year, things took an uncanny turn, as companies began offering AI-generated animations to decorate the ofrenda.

According to a report by Rest of World’s Daniela Dib, beer brand Cerveza Victoria and funeral service provider J. García López led this digital resurrection trend, with the latter receiving over 15,000 requests to animate photos of the departed. This is yet another in a long line of AI resurrection attempts made around the world.

During the Qingming festival (also called the tomb-sweeping festival) in China - which bears a close resemblance to Mexico’s Día de los Muertos - Chinese AI companies started offering “moving digital avatars” of dear departed ones for as little as 20 yuan ($2.75 or ₹233). Those mourning the recently deceased are also using these services to assuage their grief.

In India, political parties have used deepfake technology to bring back leaders like Karunanidhi, Jayalalithaa, and Buddhadeb Bhattacharjee in campaign materials, creating an emotional connection with party workers and voters through these digital apparitions.

However, as Daniela’s report points out, these high-tech séances raise serious concerns. Like India, Mexico has been hit by a rising spate of cybercrimes, and its obsolete data protection laws leave citizens vulnerable. Experts told Rest of World that they were concerned about the possibility of identity theft targeting deceased individuals, while Eon Institute's Claudia Del Pozo noted that these AI resurrections could risk disrupting traditional ways of remembering the dead, potentially blurring the line between memory and digital simulation.

In China, social media users took old videos of singer Qiao Renliang, who died by suicide in 2016, and used them to create new videos of him, drawing outrage from Qiao’s parents. His father told Chinese media that the videos were made without the family’s consent, and that they were “rubbing salt into their wounds.”

Remembering the dead is important, but it should not get in the way of letting go.

The Battle Against Deepfakes

Deepfakes are turning out to be quite a nuisance. Apart from scamming unsuspecting men into buying pills for bat-like erections, they pose a serious threat to democracies and governance.

When South Korea’s opposition leader saw a video of the country’s president announcing martial law, he thought it was a deepfake.

Fortunately, this threat has been noted around the world, and steps are being taken to fight and regulate this space. 

Hive, a leading player in deepfake detection, recently secured a two-year contract with the US Department of Defense to provide its deepfake detection capabilities for sniffing out AI-generated video, image, and audio content.

While the company has worked with major platforms like Reddit, Zynga, and Bluesky, this Department of Defense engagement will present new challenges.

A recent report by the Data Security Council of India (DSCI) and Seqrite - the India Cyber Threat Report 2025 - predicts that AI-led and deepfake-enabled cyberattacks are going to get a lot more sophisticated and intense, spotlighting the evolving tactics of cybercriminals and the rise of AI-driven attacks as a major concern.

MESSAGE FROM OUR SPONSOR

Receive Honest News Today

Join over 4 million Americans who start their day with 1440 – your daily digest for unbiased, fact-centric news. From politics to sports, we cover it all by analyzing over 100 sources. Our concise, 5-minute read lands in your inbox each morning at no cost. Experience news without the noise; let 1440 help you make up your own mind. Sign up now and invite your friends and family to be part of the informed.

Have you been a victim of AI?

Have you been scammed by AI-generated videos or audio clips? Did you spot AI-generated nudes of yourself on the internet?

Decode is trying to document cases of abuse of AI, and would like to hear from you. If you are willing to share your experience, do reach out to us at [email protected]. Your privacy is important to us, and we shall preserve your anonymity.

About Decode and Deepfake Watch

Deepfake Watch is an initiative by Decode, dedicated to keeping you abreast of the latest developments in AI and its potential for misuse. Our goal is to foster an informed community capable of challenging digital deceptions and advocating for a transparent digital environment.

We invite you to join the conversation, share your experiences, and contribute to the collective effort to maintain the integrity of our digital landscape. Together, we can build a future where technology amplifies truth, not obscures it.

For inquiries, feedback, or contributions, reach out to us at [email protected].

🖤 Liked what you read? Give us a shoutout! 📢

↪️ Become A BOOM Member. Support Us!

↪️ Stop.Verify.Share - Use Our Tipline: 7700906588

↪️ Follow Our WhatsApp Channel