Intellectual Property (IP)

Witnesses Tell Senate IP Subcommittee They Must Get NO FAKES Act Right

“It will take very careful drafting to accomplish the bill’s goals without inadvertently chilling or even prohibiting constitutional uses of technology to enhance storytelling.” – Ben Sheffner, Motion Picture Association, Inc.

The U.S. Senate Judiciary Committee’s Subcommittee on Intellectual Property met today to hear from six witnesses about a recently proposed bill to curb unauthorized uses of an individual’s voice and likeness via artificial intelligence (AI).

The “Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023” (NO FAKES Act) was introduced in October 2023 by Senator and Chair of the IP Subcommittee Chris Coons (D-DE) and Senators Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC). The goal of the bill is to “protect the voice and visual likenesses of individuals from unfair use through generative artificial intelligence (GAI).”

In his opening remarks, Coons said that, since the text of the discussion draft was released last year, they have heard from many stakeholders and held dozens of meetings to obtain feedback. The feedback has centered on five key areas to address in the bill:

  • Whether there should be a notice and takedown structure for removing illicit AI content;
  • whether the right balance has been struck in the text with the First Amendment;
  • whether the bill’s 70-year post mortem term should be adjusted or narrowed;
  • whether the Act should have preemptive effect over related state laws; and
  • whether it should create a process for individuals with limited resources and minimal damages to enforce their rights.

“I think the NO FAKES Act is unique because it touches everybody,” Tillis, who is the Subcommittee’s Ranking Member, said. Unlike other IP bills that apply just to patent holders or creators, the bill would confer a right to control one’s visual likeness on everyone.

In perhaps a nod to problems the Subcommittee has encountered with opposition to bills like PERA and PREVAIL in the past, Tillis also commented that he hopes everyone comes to the table to seek compromise on the bill, considering its importance. “The only thing that really makes me mad is when I see somebody trying to, through guerrilla warfare, undermine the good faith efforts of this committee and my colleague,” Tillis said. “If you’re at the table you can have an influence; if you’re not at the table you’re gonna be on the table. And if you’re in the category of ‘if it ain’t broke don’t fix it,’ you’re not up with modern times.”

The six witnesses who testified today included an artist named Tahliah Debrett Barnett (“FKA twigs”). A singer, songwriter, producer, dancer, and actor, Twigs explained that she is both using AI to enhance her career and also being exploited by it.

On the one hand, she said she has created an AI version of herself that she can use to speak in multiple languages in her own voice, which helps her to reach and connect with fans more effectively, and she said that AI also allows artists to “spend more time making art,” as simple one-liner press comments, for instance, can be delivered using AI. However, she has also had the experience of finding songs online seemingly made by her that she didn’t actually create or perform:

“It makes me feel vulnerable because as an artist I’m very precise. I’m very proud of my work and proud of the fact that my fans trust me because I put deep meaning into it. If legislation isn’t put in place to protect artists, not only will we let artists down who really care about what we do, but it also would mean that fans wouldn’t be able to trust people they’ve spent so many years investing in.”

The artist went on to say that it is hard to find the language to explain why AI needs to be regulated in this area because “it feels so painfully obvious to me” that artists need to have a right to control their own likeness. Tillis commented to laughter that a lot of issues Congress debates seem painfully obvious to everyone, so she’s not alone.

However, other witnesses, such as Ben Sheffner, Senior Vice President and Associate General Counsel, Law and Policy at the Motion Picture Association, Inc., said Congress needs to tread lightly lest it infringe on First Amendment rights. While Sheffner said the NO FAKES Act is a thoughtful contribution to the debate about what guardrails should be in place, “legislating in this area necessarily involves doing something the First Amendment sharply limits—regulating the content of speech.” Thus, “it will take very careful drafting to accomplish the bill’s goals without inadvertently chilling or even prohibiting constitutional uses of technology to enhance storytelling,” Sheffner said.

As an example, Sheffner referenced the 1994 film, Forrest Gump, in which the digital technology of the time was used to create replicas of figures such as John F. Kennedy and Richard Nixon. Under the draft text, representations of such figures could require consent from their heirs or corporate successors, which would violate the First Amendment.

Sheffner said five key points need to be kept in mind when refining the bill:

  • Getting exemptions right is crucial or it will chill creativity;
  • the bill should preempt state laws that regulate the use of digital replicas in expressive works;
  • the scope of the right should focus on the replacement of performances by living performers only;
  • the definition of digital replica must be focused on highly realistic versions of individuals, not cartoon-like versions; and
  • Congress must stop to consider whether what it’s seeking to protect is already covered by other laws, like defamation, for example.

Lisa P. Ramsey, Professor of Law at the University of San Diego School of Law, also warned the Subcommittee about the First Amendment implications of the bill. Ramsey said the current draft of the NO FAKES Act imposes restrictions on the content of speech that don’t pass First Amendment muster. “When the Act applies to the use of digital replicas to impersonate people in fraudulent or misleading commercial speech it is consistent with the First Amendment,” Ramsey said. “The problem is that the current version of the Act also regulates non-misleading speech. It must be narrowly tailored to directly and materially further its goals and not harm speech protected by the First Amendment more than necessary.”

As drafted, Ramsey said the bill is not consistent with First Amendment principles because it’s “overbroad and vague,” but that a revised version could withstand scrutiny. She suggested three key improvements:

  1. While she said the bill does a better job than the No AI Fraud Act in setting forth specific exceptions for the First Amendment, it’s important to include a strict rule for online service providers that says specific and actual knowledge of use of unauthorized replicas should be required for any liability. Online service providers should also implement a notice and takedown system to make it easier to remove unauthorized deepfakes, which should be able to be challenged via counter-notification.
  2. Congress should create separate causes of action that target different harms. The use of deepfakes to impersonate individuals in a deceptive manner, uses of sexually explicit deepfakes, and uses that substitute for an individual’s performance that they typically would have created in real life should all have different requirements and distinct speech-protective exceptions.
  3. Each provision should adequately protect speech interests and preempt state laws on right of publicity and digital replica rights. Individuals should be able to consent to each licensing use of digital replicas, as allowing others to control an individual’s image through broad licensing agreements would work at cross-purposes with the Act.

On the topic of service provider liability, Graham Davies, President and Chief Executive Officer of the Digital Media Association, said liability should be focused on the creator and those first releasing the content. “Streaming services don’t have any way to know the complex chain of rights,” Davies said. He also said that new legislation should be developed from the existing right of publicity laws, rather than IP law, since there’s an existing body of case law.

Duncan Crabtree-Ireland, National Executive Director and Chief Negotiator of the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA), said he likes the fact that the bill provides for broader protection than just commercial uses, but said the current language doesn’t sufficiently limit the term of a transfer or license of an individual’s likeness. This could be a problem for a young artist, for instance. “I’d adopt a durational limitation on transfers and licenses during lifetime,” Crabtree-Ireland said. “It’s essential to make sure someone doesn’t improvidently grant a transfer of rights early in life that turns out to be unfair to them.”

He added that, while SAG-AFTRA members are strong advocates for the First Amendment, the Supreme Court has made it clear that the First Amendment does not require that the speech of the press be privileged over the rights of the individual being depicted, and that balancing tests should determine which right will prevail.

Robert Kyncl, Chief Executive Officer of Warner Music Group, agreed, explaining that First Amendment concerns over responsible AI are misguided. “AI can make you say things you didn’t say or don’t believe; that’s not freedom of speech,” Kyncl said.

He added that attribution through watermarking in order to determine provenance will be crucial and urged the Subcommittee to take the time to get it right. “We are in a unique moment of time where we can still act and we can get it right before it gets out of hand—the genie’s not yet out of the bottle, but it will be soon.”

Twigs, who is represented by Warner Music Group, reiterated the importance of giving artists back control:

“Ultimately, what it boils down to is my spirit, my artist[ry] and my brand is my brand and I’ve spent years developing it and it’s mine—it doesn’t belong to anybody else to be used in a commercial sense or a cultural sense, or even just for a laugh. I am me, I am a human being, and we have to protect that.”


Editorial Staff
