U.S. Senators Target Unauthorized AI Soundalike Tracks With Bipartisan ‘No Fakes Act’

The Senate side of the U.S. Capitol. Photo Credit: Scrumshus

With artificial intelligence music – including a growing number of unauthorized soundalike releases – becoming increasingly prevalent, four U.S. senators have introduced a bill that they say would “protect the voice and visual likenesses of individuals from unfair use.”

Senators Marsha Blackburn (R-TN), Chris Coons (D-DE), Amy Klobuchar (D-MN), and Thom Tillis (R-NC) formally announced the bipartisan legislation, dubbed the No Fakes Act, today. Throughout 2023, numerous industry organizations and companies have publicly addressed the need for a cohesive regulatory approach to AI’s rapidly expanding role in the creative space.

Meanwhile, amid litigation over the protected works upon which generative models are trained, AI-powered soundalike songs have for some time been making waves. Third parties have released the vast majority of the projects without permission from the appropriate professionals, and despite being booted from leading music platforms such as Spotify, the tracks are continuing to rack up millions of views/plays.

Enter the No Fakes Act (full title “Nurture Originals, Foster Art, and Keep Entertainment Safe Act”), which the aforementioned lawmakers claim will establish “the right to authorize the use of the image, voice, or visual likeness of the individual in a digital replica” and, in the process, help stop the distribution of AI soundalike works absent clear-cut consent.

According to a discussion draft, persons including “sound recording artists” (as well as their heirs/representatives when applicable) would under the No Fakes Act be able to pursue legal action against defendants who create “digital replicas” sans authorization.

Also potentially on the hook for these unauthorized usages are those who facilitate the “publication, distribution, or transmission of, or (are) otherwise making available to the public, an unauthorized digital replica, if the person engaging in that activity has knowledge that the digital replica was not authorized.”

As described in the draft, “digital replica” refers to any computer-generated “image, voice, or visual likeness of an individual” that’s “nearly indistinguishable” from the original at hand and is featured in a recording or video in which the mimicked party didn’t actually appear.

Notably, the text further outlines an array of exceptions to the proposed law, including for certain digital replicas used in “a news, public affairs, or sports broadcast or report,” “a documentary, docudrama, or historical or biographical work,” or adverts for the works.

Exceptions would likewise exist for replicas used for the “purposes of comment, criticism, scholarship, satire, or parody” as well as instances wherein usages are “de minimis or incidental,” the document shows.

Predictably, individuals whose “image, voice, or visual likeness” rights were allegedly violated by an AI-made digital replica, as well as the owners of those rights, would be able to bring civil actions under the legislation. Plus, as written in the draft, record labels would also have the option of suing over alleged soundalike/lookalike efforts.

Regarding “a digital replica involving a sound recording artist,” the draft spells out, “any person that has entered into a contract for the exclusive personal services of the sound recording artist as a sound recording artist” could sue.

In any event, plaintiffs seeking relief under the bill would be entitled to (among other things) the larger of “$5,000 per violation” – of course, the definition and extent of “violation” will prove rather important – or “any damages suffered” due to the alleged violation(s).

The American Association of Independent Music (A2IM) reached out to Digital Music News with a statement about the No Fakes Act, touting the proposal as a meaningful step in the right direction.

“Independent record labels and the artists they work with are excited about the promise of AI to transform how music is made and how consumers enjoy art, but there must be guardrails to ensure that artists can make a living and that labels can recoup their investments,” A2IM president and CEO Richard James Burgess said in part.

“Generative AI could be the next step in a downward spiral of devaluing music on the Internet that started with digital piracy and has continued with underpayment by streaming services and social media platforms.

“The time is now to do away with the state patchwork of laws on the rights of creators to fight against fakes, and we are committed to working with the Senators and all stakeholders to get this right.

“Our experience with the DMCA and other enforcement tools is that real enforcement can become elusive and prohibitively expensive for small creators. We will be laser-focused on making sure that any federal solutions on digital replicas are accessible to small labels and working-class musicians, not just the megastars,” he concluded.