AI Music Startups Face The Music | Deepfake Watch
June 28, 2024
By now, you may have grown used to AI companies being hit with copyright lawsuits. Few would dispute the usefulness of AI-powered tools, but the question of what data they are trained on is a different matter altogether.
With industry leaders OpenAI and Microsoft facing copyright issues over how they trained their large language models, companies behind AI music generators are now facing the music.
Opt in to receive the newsletter every Friday.
The Case of Suno and Udio
Two leading US-based AI music startups, Suno and Udio, are facing lawsuits filed by the Recording Industry Association of America (RIAA), on behalf of some of the biggest record labels, for the unauthorised use of copyrighted music to train their AI music generators.
Warner Music Group, Universal Music Group and Sony Music Group, among others, claim that the music generated by these tools “imitate[s] the qualities of genuine human sound recordings,” and are seeking damages of up to US$150,000 per work.
The lawsuit follows an open letter signed by 200 artists, including Billie Eilish, Pearl Jam and Nicki Minaj, calling for the rights of human artists to be protected.
Both Suno and Udio have put out statements highlighting efforts to prevent the imitation of copyrighted music.
“Generative AI models, including our music model, learn from examples. Just as students listen to music and study scores, our model has ‘listened’ to and learned from a large collection of recorded music,” stated Udio, in response to the lawsuit.
Suno fired back aggressively, stating, “We would have been happy to explain this to the corporate record labels that filed this lawsuit (and in fact, we tried to do so), but instead of entertaining a good faith discussion, they’ve reverted to their old lawyer-led playbook.”
However, neither company has commented on its training data, which both claim is proprietary.
Meet GetReal Labs - the deepfake hunter
Media forensics company GetReal Labs announced its launch yesterday.
The company has developed tools to detect images, video and audio that have been created or manipulated, whether with AI or old-school editing techniques. Its tools can also analyse content in real time, helping to detect deepfakes during an ongoing video call.
Fact-checkers and researchers around the world have been grappling with increasingly sophisticated deepfakes, and have grown ever more dependent on technology to scrutinise such content as manual methods fall short.
The company, co-founded by media forensics expert Hany Farid, was incubated by cybersecurity venture capital firm Ballistic Ventures over the past two years, and has raised investment from Ballistic Ventures and Venrock.
Have you been a victim of AI?
Have you been scammed by AI-generated videos or audio clips? Did you spot AI-generated nudes of yourself on the internet?
Decode is documenting cases of AI abuse and would like to hear from you. If you are willing to share your experience, reach out to us at [email protected]. Your privacy is important to us, and we will preserve your anonymity.
About Decode and Deepfake Watch
Deepfake Watch is an initiative by Decode, dedicated to keeping you abreast of the latest developments in AI and its potential for misuse. Our goal is to foster an informed community capable of challenging digital deceptions and advocating for a transparent digital environment.
↗️ Was this forwarded to you? Subscribe Now
We invite you to join the conversation, share your experiences, and contribute to the collective effort to maintain the integrity of our digital landscape. Together, we can build a future where technology amplifies truth rather than obscuring it.
For inquiries, feedback, or contributions, reach out to us at [email protected].
🖤 Liked what you read? Give us a shoutout! 📢
↪️ Become A BOOM Member. Support Us!
↪️ Stop.Verify.Share - Use Our Tipline: 7700906588
↪️ Follow Our WhatsApp Channel
↪️ Join Our Community of TruthSeekers