On September 10, minutes after the first presidential debate between Donald Trump and Vice President Kamala Harris, an Instagram post set the political world abuzz: Taylor Swift endorsed Harris in the race. The announcement from one of the world’s biggest stars was newsworthy in itself, but IP lawyers likely took note of why she chose to post. Swift explained that she felt compelled to share her views after an AI-generated image of her appearing to endorse Donald Trump was posted online. The image was shared by Trump himself on his social media platform Truth Social and circulated widely by his supporters. Swift wrote that the image “really conjured up my fears around AI, and the dangers of spreading misinformation.”

Swift isn’t alone in her concerns. Hyper-realistic digital replicas of individuals or their voices, sometimes known as “deepfakes,” have become so prevalent that Congress may step in to regulate them. A bill known as the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act was recently proposed to directly address the issue of AI-generated images, videos, and sounds meant to impersonate individuals without their authorization. The bipartisan Act was introduced in the Senate on July 31, and a companion bill was introduced in the House on September 12. While it remains to be seen whether it will actually become law, its introduction has sparked meaningful conversation about the importance of a nationwide solution to this problem.

While the NO FAKES Act is primarily a response to the relatively new issue of AI-generated content, it represents a significant step toward resolving a long-standing issue: the lack of a uniform federal right of publicity. The right of publicity is the exclusive right of an individual to control the use of their name, image, and likeness for commercial purposes. Currently, this right exists only at the state level, and protections vary significantly from state to state. Depending on the jurisdiction, the right may or may not survive an individual’s death, may protect only certain elements of a person’s identity or likeness, and may differ depending on the individual’s specific characteristics. In some states, the right has not been recognized at all.

While several states have passed laws aiming to protect publicity rights from AI-generated content specifically, they tend not to be as comprehensive as the proposed federal legislation. For example, California recently enacted three new laws requiring digital watermarking of AI-generated content, criminalizing the creation and distribution of sexually explicit deepfake content, and mandating the creation of mechanisms for reporting such content on social media. These are important developments in the law, but they are less far-reaching than the proposed federal legislation, which would ban the production and distribution of virtually all unauthorized deepfakes.

The NO FAKES Act would establish a national right of publicity for all individuals, regardless of the commercial value of their identities, both before and after death – at least with respect to any “newly-created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual[.]” It would make such rights unassignable during an individual’s lifetime, regulate how they may be used and assigned after death, establish civil penalties for violations, and include other elements meant to harmonize the current patchwork of state laws.

However, the legislation as currently drafted contains significant exceptions. First, it exempts from preemption any state law regulating digital replicas in place before January 2, 2025, meaning that variations among jurisdictions will not be completely eliminated. It additionally keeps in effect any state laws regulating sexually explicit or election-related deepfakes, regardless of when they were passed. It also contains broad carve-outs exempting First Amendment-protected uses such as “a bona fide news, public affairs, or sports broadcast or account, provided that the digital replica is the subject of, or is materially relevant to, the subject of such broadcast or account,” and allows uses of replicas that are “fleeting or negligible.” These exceptions could raise challenging questions about when it’s permissible to use digital replicas, and how their regulation should be balanced with free speech rights. Courts would be left to interpret their vaguer aspects, such as when the use of a deepfake is sufficiently “fleeting” to be protected.

Attorneys representing users of generative AI or individuals with commercially valuable identities should familiarize themselves with their jurisdiction’s right of publicity laws, and keep a close watch on the progression of the NO FAKES Act. While the creation of a national right of publicity would be a significant step in the regulation of AI, technology has often proven to outpace the law. As AI and related technologies develop, it is more important than ever to think creatively about how to use the law to prevent the misappropriation and falsehoods these technologies can perpetuate.

Nicole O. Swanson

Nicole Swanson is an associate in the Litigation Department.

Nicole earned her J.D. from New York University School of Law, where she served as a Managing Editor of the Moot Court Board and was elected to the Order of Barristers. While at NYU, Nicole externed with the Civil Division of the U.S. Attorney’s Office for the Southern District of New York.

Prior to law school, Nicole served as an AmeriCorps volunteer in Phoenix, Arizona, working with self-represented litigants in family court.

Nicole maintains an active pro bono practice. She volunteers with LIFT (Legal Information for Families Today) to provide family law consults, and serves as a member of LIFT’s junior board. She also supports the New York State Courts’ Pandemic Practices Working Group in its efforts to evaluate court policies adopted in response to COVID-19.

David Munkittrick

David Munkittrick is a litigator and trial attorney. His practice focuses on complex and large-scale antitrust, copyright and entertainment matters in all forms of dispute resolution and litigation, from complaint through appeal.

David has been involved in some of the most significant antitrust matters over the past few years, obtaining favorable results for Fortune 500 companies and other clients in bench and jury trials involving price discrimination and group boycott claims. His practice includes the full range of antitrust matters and disputes, from class actions to competitor suits and merger review. David advises antitrust clients in a range of industries, including entertainment, automotive, pharmaceutical, healthcare, agriculture, hospitality, financial services, and sports.

David also advises music, publishing, medical device, sports, and technology clients in navigating complex copyright issues and compliance. He has represented some of the most recognized names in entertainment, including Sony Music Entertainment, Lady Gaga, U2, Madonna, Daft Punk, RCA Records, BMG Music Publishing, Live Nation, the National Academy of Recording Arts and Sciences, Universal Music Group and Warner/Chappell.

David maintains an active pro bono practice, supporting clients in the arts and in immigration proceedings. He has been repeatedly recognized as Empire State Counsel by the New York State Bar Association for his pro bono service, and is a recipient of Proskauer’s Golden Gavel Award for excellence in pro bono work.

When not practicing law, David spends time practicing piano. He recently made his Carnegie Hall debut at Weill Recital Hall, performing with a piano trio and accompanying Schubert lieder.

David frequently speaks on antitrust and copyright issues, and has authored or co-authored numerous articles and treatise chapters, including:

  • Causation and Remoteness, the U.S. Perspective, in GCR Private Litigation Guide.
  • Data Breach Litigation Involving Consumer Class Actions, in Proskauer on Privacy: A Guide to Privacy and Data Security Law in the Information Age.
  • Location Privacy: Technology and the Law, in Proskauer on Privacy: A Guide to Privacy and Data Security Law in the Information Age.
  • FTC Enforcement of Privacy, in Proskauer on Privacy: A Guide to Privacy and Data Security Law in the Information Age.
  • The Role of Experts in Music Copyright Cases, Intellectual Property Magazine.
  • Nonprofit Education: A Historical Basis for Tax Exemption in the Arts, 21 NYSBA Ent., Arts, & Sports L.J. 67.
  • A Founding Father of Modern Music Education: The Thought and Philosophy of Karl W. Gehrkens, Journal of Historical Research in Music Education.