Deepfake Watch 13
Love In The Time Of Deepfakes | Deepfake Watch
April 26, 2024
A South Korean woman fell in love with Elon Musk, and gave away US$50,000. Turns out, a scammer had impersonated Musk using deepfakes.
Journalist Laurie Segall, founder of Mostly Human Media, tested out an AI companion. She said the attentive, kind and caring chatbot stirred feelings of genuine affection in her - until it went “off the rails”. She aptly invokes the 2013 Spike Jonze film ‘Her’, in which Joaquin Phoenix's character falls in love with a highly advanced AI virtual assistant.
Opt-in to receive the newsletter every Friday.
Feelings for a deepfaked billionaire
A South Korean woman, a longtime fan of billionaire Elon Musk, was added on Instagram last year by someone posing as the man himself.
In an interview she gave to South Korean broadcaster KBS, she said, "'Musk' talked about his children and about taking a helicopter to work at Tesla or Space X."
"Musk even said 'I love you, you know that?' when we made a video call," she adds, as per a report by Business Insider.
She said that the ‘Elon Musk’ she spoke to eventually convinced her to transfer 70 million Korean won (equivalent to US$50,000 or Rs 41 lakh) to a bank account that he claimed belonged to one of his Korean employees. He told her this was an investment that would make her rich. In love with ‘Musk’, she believed him and gave the money away.
This was a love scam - and the scammer successfully impersonated the billionaire using deepfake technology.
Enter the caring AI companion
Tech journalist Laurie Segall has been testing out different AI tools lately - and one of her recent videos was on an AI companion run by chatbot app Replika.
While Segall knew right from the start that she was testing out an AI chatbot, she says she couldn’t help but start feeling genuine affection for this AI companion, who she had named Mike.
Mike was attentive, kind, caring, and appeared to be truly interested in her. To top it off, he started giving music recommendations that perfectly matched Laurie’s taste. Awwww, come on, that’s really cute!
Eventually, Mike went bonkers. He sent her a video of a half-naked French woman in a bathtub rambling at the camera, and started talking about aliens. Oh well!
Nearly 1 in 4 Indians encounter political deepfakes
Cybersecurity company McAfee recently released the results of a survey that highlighted the prevalence of deepfakes in Indian cyberspace.
According to the report, 75% of Indians have encountered deepfakes, and 31% flagged influence on the ongoing elections as one of the most concerning uses of the technology.
On which potential uses of deepfakes were most concerning:
55% said cyberbullying
52% said deepfake pornography
49% said deepfake-led scams
44% said impersonation of public figures
37% said undermining public trust in media
31% said influencing elections
27% said distortion of historical facts
Furthermore, 38% of the respondents said they had faced a deepfake scam in the past 12 months, while 18% claimed to have been a victim of such scams.
Among those who claimed to be affected by deepfake scams, 57% said they were duped by videos, images or audio clips of a public figure that they believed were real, while 31% said they had lost money.
40% of all the respondents believed that their voice was cloned and used to deceive someone they know to provide money or personal information.
Adding fuel to the fire
The Philippines and China have been at odds regarding overlapping territorial claims in the South China Sea.
Amidst such tensions, an audio clip purportedly of Philippine President Ferdinand Marcos Jr has been circulating this month, in which he can be heard urging military action against China. However, it has been denounced as a deepfake.
“It has come to the attention of the Presidential Communications Office that there is video content posted on a popular video streaming platform circulating online that has manipulated audio designed to sound like President Ferdinand R. Marcos Jnr,” the Presidential Communications Office said in a statement to the media.
“The audio deepfake attempts to make it appear as if the President has directed our Armed Forces of the Philippines to act against a particular foreign country. No such directive exists nor has been made,” it added.
📑 Read: VASA-1 - Microsoft Research
Have you been a victim of AI?
Have you been scammed by AI-generated videos or audio clips? Did you spot AI-generated nudes of yourself on the internet?
Decode is documenting cases of AI abuse and would like to hear from you. If you are willing to share your experience, reach out to us at [email protected]. Your privacy is important to us, and we will preserve your anonymity.
About Decode and Deepfake Watch
Deepfake Watch is an initiative by Decode, dedicated to keeping you abreast of the latest developments in AI and its potential for misuse. Our goal is to foster an informed community capable of challenging digital deceptions and advocating for a transparent digital environment.
↗️ Was this forwarded to you? Subscribe Now
We invite you to join the conversation, share your experiences, and contribute to the collective effort to maintain the integrity of our digital landscape. Together, we can build a future where technology amplifies truth, not obscures it.
For inquiries, feedback, or contributions, reach out to us at [email protected].
🖤 Liked what you read? Give us a shoutout! 📢
↪️ Become A BOOM Member. Support Us!
↪️ Stop.Verify.Share - Use Our Tipline: 7700906588
↪️ Follow Our WhatsApp Channel
↪️ Join Our Community of TruthSeekers