A New Privacy Nightmare
Cops and stalkers have always tried to use the latest technology to keep tabs on their people of interest. Whether through internet chat forums, social media, or open-source investigative tricks, they've found new ways to track their targets.
Now, all that can be automated at scale.
I spy... You spy... GeoSpy
Working in a fact-checking newsroom entails plenty of geolocation: locating photos and videos through visual cues. That is how I came across GeoSpy, a nifty little tool that works like a digital bloodhound, sniffing out locations by analysing street architecture, vegetation, and spatial relationships.
I had given it quite a few challenges, and it managed fairly well, with a few misses. But I couldn't shake off an uneasy feeling about the tool's potential for abuse. It turns out I wasn't the only one.
The brainchild of Graylark Technologies, GeoSpy needs neither GPS data nor metadata, and it can pinpoint where a photo was taken within seconds of upload. Until recently, the tool was accessible to anyone with internet access and curiosity.
After 404 Media publicly exposed the tool, Graylark quickly pivoted, restricting access to "qualified" entities. The report also highlights that Graylark's biggest clients are cops.
Read: The Powerful AI Tool That Cops (or Stalkers) Can Use to Geolocate Photos in Seconds | 404 Media
I get it: this tool would be a leap for law enforcement investigations, from identifying crime scenes to potentially tracking military movements. But as Cooper Quintin, security researcher and senior public interest technologist at the Electronic Frontier Foundation, points out to 404 Media, a tool like this could pose a serious threat if misused.
Quintin warns that if cops start deploying the tool at scale to build a geolocation database, or to “gather evidence on people not engaged in suspected criminal activity”, it could lead to a host of problems, including wrongful arrests.
Not to mention, this tool would be like steroids for surveillance systems, making it trivial to stalk and locate people through the many photos they post of themselves, or of others. Journalists and activists resisting authoritarian governments would have it especially bad, as they are on the front lines of mass surveillance.
The takeaway? Your selfies just became a lot more revealing. For activists, journalists, and anyone valuing privacy, it's time to rethink how and where we share our images.
We're hopeful GeoSpy will continue to help us geolocate visuals for fact-checking, rather than become another tool locked behind a massive paywall.
Speaking of fact-checking, the other big challenge we've been consistently facing is deepfakes. And there's a new tool that could potentially help us with voice clones.
McAfee's Deepfake Detector
McAfee has launched a deepfake detector in collaboration with AMD, targeting AI-generated video misinformation. The tool rapidly identifies AI-altered audio within videos using convolutional neural network (CNN) models.
Operating directly on users' devices, it promises near-instant detection without collecting personal audio. However, there's a limitation: the tool focuses primarily on detecting voice clones and other forms of audio manipulation, and does not address the growing threat of AI-generated visuals. Researchers have also warned that more sophisticated adversaries can bypass such detectors through techniques like noise addition.
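To illustrate what "noise addition" means here, the idea is that an adversary mixes a faint layer of random noise into a cloned voice track, quiet enough to be barely audible but enough to perturb the statistical fingerprints a detector relies on. A toy sketch in Python (our own illustration, not McAfee's or any attacker's actual code; the function name and parameters are hypothetical):

```python
import random

def add_noise(samples, snr_db=30.0):
    """Mix white noise into a waveform at a given signal-to-noise ratio (dB).

    Toy illustration of the "noise addition" evasion technique: at a high
    SNR the perturbation is barely audible, yet it alters the waveform's
    fine-grained statistics. `samples` is a list of floats in [-1, 1].
    """
    signal_power = sum(s * s for s in samples) / len(samples)
    noise_power = signal_power / (10 ** (snr_db / 10))
    sigma = noise_power ** 0.5  # standard deviation of the added noise
    return [s + random.gauss(0.0, sigma) for s in samples]

# Example: a simple ramp waveform, perturbed at 30 dB SNR
clean = [i / 100 for i in range(-100, 101)]
noisy = add_noise(clean)
```

Real evasion attacks are adversarially optimised rather than random, but the principle is the same: small, targeted changes that humans ignore can flip a classifier's verdict.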
Have you been a victim of AI?
Have you been scammed by AI-generated videos or audio clips? Did you spot AI-generated nudes of yourself on the internet?
Decode is trying to document cases of abuse of AI, and would like to hear from you. If you are willing to share your experience, do reach out to us at [email protected]. Your privacy is important to us, and we will preserve your anonymity.
About Decode and Deepfake Watch
Deepfake Watch is an initiative by Decode, dedicated to keeping you abreast of the latest developments in AI and its potential for misuse. Our goal is to foster an informed community capable of challenging digital deceptions and advocating for a transparent digital environment.
We invite you to join the conversation, share your experiences, and contribute to the collective effort to maintain the integrity of our digital landscape. Together, we can build a future where technology amplifies truth, not obscures it.
For inquiries, feedback, or contributions, reach out to us at [email protected].
🖤 Liked what you read? Give us a shoutout! 📢
↪️ Become A BOOM Member. Support Us!
↪️ Stop.Verify.Share - Use Our Tipline: 7700906588
↪️ Follow Our WhatsApp Channel
↪️ Join Our Community of TruthSeekers