Protecting Musicians From Artificial Intelligence Misuse

By Rachel Stilwell, Esq.
Tools For Protecting Artists From Unauthorized Use of Artificial Intelligence in Music.

Artificial Intelligence has been a hot-button topic within the entertainment industry for well over a year. This article proposes methods to empower creative professionals to protect their intellectual property in the present and future, including novel approaches not included in current legislative efforts.

COPYRIGHT LAW

Artificial Intelligence jolted the music industry in 2023 with the release of sound recordings created by scraping the vocals of artists like Drake, The Weeknd, and Bad Bunny without their consent. Those unauthorized recordings capitalized on the scraped vocals and AI training to generate new tracks mimicking the voices of those famous artists.

In these cases, the artists’ record labels moved to quash the unauthorized distribution of this music on the basis that the use of their artists' recorded voices constituted copyright infringement.

Record labels traditionally own the copyrights in sound recordings made by artists who have entered into exclusive recording agreements (although some artists like Taylor Swift have more recently used their leverage to demand that they retain ownership of their masters going forward). Record labels that own copyrights in sound recordings have, by virtue of that ownership, legal standing to make copyright infringement claims when AI users generate voice-cloned sound recordings and distribute them without the label’s consent.

Unauthorized AI-generated tracks put words (lyrics) in artists’ mouths that they neither wrote nor agreed to sing. This also entails copying musical compositions created by songwriters; if such copying is done without the consent of those songwriters or their publishers, then that's copyright infringement. (For those of you who want to learn more about purported "Fair Use" defenses with respect to the use of musical compositions, check out my prior blog about Randi Zuckerberg's stupid and cringeworthy infringement of Dee Snider's "We're Not Gonna Take It" here).

Some argue that current copyright laws do not go far enough to protect creators of copyrightable works from misuse of generative artificial intelligence. We are interested in a bill recently introduced in the House of Representatives by Congressman Adam Schiff of California: the "Generative AI Copyright Disclosure Act of 2024." If enacted, this bill would (1) require that a notice be submitted to the Register of Copyrights prior to the public release of a new generative AI system, including a sufficiently detailed summary of all copyrighted works used in building or altering the training dataset for that system; (2) require the Copyright Office to establish a publicly available online database of the notices filed; (3) require the Copyright Office to issue regulations implementing these requirements and to assess civil penalties for failure to comply; and (4) apply these requirements retroactively to generative AI systems already available to consumers. We like the policy behind this bill and are particularly curious about how, if the bill becomes law, civil penalties would actually be applied (and by whom). While time is running out in the current congressional session (2023-2024) for this bill to become law, we look forward to seeing how another version of it may be reintroduced next year.

STATE LAWS GOVERNING RIGHTS OF PUBLICITY: IT'S TIME TO MAKE THESE PROTECTIONS FEDERAL

Those artists who do not own the copyright in their sound recordings have little recourse under federal copyright law in the event of such unauthorized AI-generated sound recordings. Some states in the U.S. have laws prohibiting the misappropriation and distribution of artists’ names, likenesses, and, in a select handful of states, voices. Those laws govern what is known as “rights of publicity.” For example, California has a set of statutes, Cal. Civ. Code §§ 3344 and 3344.1, prohibiting the misappropriation and distribution of a person’s name, likeness, voice, signature, or photograph without that person’s prior consent.

While only about half of the states in the U.S. have laws governing rights of publicity, recently proposed federal protections against AI-generated digital replicas of performers would prohibit such misappropriation of persona on a nationwide basis. A newly proposed bipartisan bill, the “NO FAKES ACT,” was recently released as a discussion draft by Senators Coons (D-Del.), Blackburn (R-Tenn.), Klobuchar (D-Minn.) and Tillis (R-N.C.). This bill would create a new right of publicity enforceable in all states and territories in the U.S. As an attorney who regularly represents creative professionals (recording artists, songwriters, actors, etc.) whose rights of publicity have real economic value that should belong to those individuals, to keep or to license to others at their own discretion, I believe the NO FAKES ACT deserves strong support. I will continue to volunteer my time and expertise in advocating for its passage.

That said, even when bipartisan, legislation often moves slowly, and there’s no guarantee of passage. So, in addition to relentless advocacy for the NO FAKES ACT, we must push to strengthen the bill’s language by removing its narrow exceptions for so-called “parody and satire.” Parody and satire are notoriously difficult concepts to define in litigation. Any legislation should, as copyright law does, place the burden of proving that a parody or satire exception applies on the infringer rather than on the victim of infringement.

In addition to advocacy for an amended version of the NO FAKES ACT, we also need concurrent strategies to fix the insufficient patchwork of state laws governing rights of publicity. Most importantly, states that do not already have a statute protecting individuals’ rights of publicity should enact one. I am in active discussion with state lawmakers outside of California about introducing such legislation in the upcoming legislative session.

RACHEL SUGGESTS A NOVEL APPROACH: AMEND & ENFORCE ANTI-BOOTLEGGING STATUTES?

Yet another concurrent approach that I propose, which should not be viewed as a substitute for any of the approaches above, is to amend the existing federal law that protects against the unauthorized recording and unauthorized distribution of a musical performance. An amended version of this seldom-enforced federal statute could be used to protect singers and instrumentalists from unauthorized distribution of their recordings, including through AI generation, even if those musicians do not own the copyrights in their recordings due to contractual obligations. Note that while some state statutes prohibit the misappropriation of a human voice, those laws do not extend to the sounds of instrumentalists, and instrumentalists deserve protection from misappropriation of their performances, too. In case the much broader NO FAKES ACT either doesn't pass or takes many years to pass, an amendment to an existing anti-bootlegging law that applies more narrowly, only to recorded musical performances, might be easier to enact.

The federal civil anti-bootlegging statute (17 U.S.C. § 1101) imposes civil penalties for the unauthorized recording of live performances, or for the transmission or distribution of such recordings, even if the bootlegging is not done for commercial gain. The statute provides that anyone who engages in these prohibited acts is potentially liable for monetary damages, and a court may also order the seizure of the unauthorized recordings.

Similarly, 18 U.S.C. § 2319A makes bootlegging a criminal offense if the perpetrator — without the consent of the applicable artist — knowingly records a live musical performance or distributes that recording for commercial gain.

Anti-bootlegging laws have rarely been enforced, but, as I wrote in Billboard a few years back, federal law specifically prohibits recording a musical performance without the performer’s consent. Accordingly, musicians performing in the U.S. have a legal right to prohibit unauthorized recording of their live performances, regardless of whether the people doing the recording intend to distribute the resulting video or audio.

The current federal anti-bootlegging laws prohibit two distinct kinds of conduct. Most members of the entertainment business understand “bootlegging” to mean recording live musical performances without the performers’ consent; that practice is already prohibited. But current law also prohibits the unauthorized distribution of recordings of live performances. Suppose an artist tells a fan that the fan may record the artist's live performance for the fan's own personal enjoyment, but asks the fan to refrain from uploading the recording to public platforms. If the fan later released that recording publicly, the fan would have violated this federal law.

Current civil and criminal anti-bootlegging statutes prohibit unauthorized recording and distribution of LIVE musical performances. So, while these laws protect against unauthorized distribution of concert footage, they do not currently prohibit unauthorized distribution of studio recordings. For this reason, I do not believe that the current anti-bootlegging statutes go far enough to protect artists in the age of AI. To make these laws more protective of artists, I propose that Congress amend the anti-bootlegging statutes simply by removing the word “live” from the governing language. With that amendment, artists who make studio recordings (thereby consenting to the MAKING of those recordings) would be protected against the unauthorized DISTRIBUTION of those recordings, whether in an AI context or otherwise. Artists would be afforded such protections even if they didn’t own the copyrights in their own master recordings.

One last note on how Congress should amend the criminal and civil anti-bootlegging statutes: some academics have argued that these laws are unconstitutional because their protections against “bootlegging” have no end date; theoretically, they could last forever under current law. Compare copyright law, which provides an end date for protection: the term of federal copyright in the U.S. is currently measured by the life of the author plus 70 years. Although at least one federal court in California has ruled that the civil anti-bootlegging statute is not unconstitutional, I think that if Congress amends the statute to extend its coverage to studio recordings, it should also insert an end date limiting protection to the same period specified by copyright law. If these changes are made, artists would be far more protected than they are now, regardless of whether they own their own master recordings and regardless of whether they are singers or instrumentalists.

If you want to stay connected and learn more about this topic, I encourage you to contact me at rachel@rmslawoffices.com.

Rachel Stilwell, Esq. is Owner of Stilwell Law, a boutique law firm with expertise in entertainment, intellectual property, media, and employment law. The views expressed in the article are the author's and should not be attributed to any organizations with which Rachel is affiliated.
