YouTube Unveils Voice-Detection Tool Targeting AI-Powered ‘Synthetic Singing,’ Eyes 2025 Launch

YouTube is preparing to release a tool with which rightsholders can detect and manage videos containing unauthorized AI vocals. Photo Credit: Javier Miranda

YouTube is officially set to launch “synthetic-singing identification technology” that it says will allow rightsholders to detect and manage uploads containing unauthorized soundalike vocals. The video-sharing giant, having reportedly been in active AI discussions with the major labels for some time, disclosed the voice-detection tool in a brief update today. Penned by VP of product management for creator products Amjad Hanif, the update, aptly titled “New tools to protect creators and artists,” arrives on the heels of multiple other music-focused AI moves from YouTube.

For instance, July saw the Dream Track developer roll out an updated eraser feature that uses AI to remove copyrighted tracks without compromising the rest of a video’s audio. The same month also brought revamped YouTube privacy guidelines under which “uniquely identifiable” first parties can seek the removal of certain unauthorized AI soundalike and/or lookalike media.

And now, a vocals-detection capability appears poised to take YouTube’s AI-protection offerings a step further. Expected to debut with a pilot “early next year,” the tech will be available directly through Content ID, according to YouTube. While concrete details are few and far between at present, Hanif did spell out that eligible partners will be able “to automatically detect and manage AI-generated content…that simulates their singing voices.”

Especially because one needn’t search far on YouTube to find presumably unauthorized soundalike projects – referring both to original artificial intelligence tracks such as “Heart on My Sleeve” and an abundance of AI-powered covers – it’ll be interesting to see the tool’s effectiveness and ultimate impact.

Meanwhile, though it doesn’t really need saying, non-audio AI products (video-generation models in particular) are evolving by leaps and bounds. Given the rapid improvements, it seems a matter of when, not if, eerily realistic AI lookalike media will explode in popularity.

Enter the second piece of AI-related tech announced today by YouTube, which says it’s developing a separate tool designed to help musicians and others “detect and manage AI-generated content showing their faces.” This option, Hanif noted, will complement the previously highlighted privacy-policy updates.

Looking ahead to 2025 and beyond, these increasingly robust AI-detection abilities could provide a means of controlling and monetizing artificial intelligence music. Adjacent to near-term efforts to get a handle on unapproved AI works uploaded to YouTube, Instagram, TikTok, Spotify, and other high-profile platforms, the likes of Sony Music (which is developing a little-discussed proprietary AI product) and Universal Music (now partnered with content-attribution startup ProRata.ai) are also eyeing bigger-picture solutions.

But as things stand, the unprecedented technology remains at the center of high-stakes infringement litigation – with further effects yet to stem from the massive volume of non-infringing AI works reportedly flooding Spotify and other services.
