The ELVIS Act: Setting the Stage for Policing Unauthorized Use of AI-Generated Sound and Likeness
Last month, Tennessee, longtime home of the “King of Rock and Roll,” broadened the state’s already robust right of publicity statute by passing the Ensuring Likeness Voice and Image Security Act (the “ELVIS Act”). The ELVIS Act, which goes into effect on July 1, breaks new ground by specifically targeting generative artificial intelligence (AI) platforms and services that could enable the use of a person’s likeness without permission. Other states, and potentially Congress, may soon follow Tennessee’s “opening act” by broadening right of publicity protections focused on uses of generative AI.
Protection of Voice
Replacing Tennessee’s existing right of publicity statute,[1] the ELVIS Act explicitly adds voice to the statutory protections for name and likeness. The act defines “voice” as “a sound in a medium that is readily identifiable and attributable to a particular individual, regardless of whether the sound contains the actual voice or a simulation of the voice of the individual.”[2] Notably, the ELVIS Act’s protection extends beyond the commercial uses typically covered by other states’ right of publicity statutes, opening up potential liability for uses of a person’s likeness outside a commercial context. Deepfakes, for example, could trigger liability even if not used commercially.
Targeting Generative AI Tools
Generative AI garnered newfound public attention with AI-generated impersonations going viral, including purported new music by Drake and TikTok selfie videos of Tom Cruise. The ELVIS Act explicitly extends liability under state law to generative AI technology services that mimic a known individual’s persona. Section 6(a)(3) of the act creates liability where someone:
- “distributes, transmits, or otherwise makes available an algorithm, software, tool, or other technology, service, or device”;
- “the primary purpose or function” of which “is the production of a particular, identifiable individual’s photograph, voice, or likeness”; and
- with knowledge that the use was not authorized.[3]
While it remains to be seen how courts will interpret what constitutes a “primary purpose or function,” AI-enabled platforms should take preemptive steps to avoid potential liability. Companies offering generative AI tools for content creation will need to clearly demonstrate the tools’ designed, legally permissible purpose and function, for example through contractual terms prohibiting unauthorized use of another’s likeness and through careful management of how the tools are marketed and used.
Post-Mortem Rights
The ELVIS Act maintains a post-mortem right, meaning that use of any aspect of a deceased individual’s likeness could still be off-limits. Rights under the act terminate only upon proof of two years of commercial non-use following the initial 10-year period after the individual’s death. For example, the rights of an individual who died in 2020 could not lapse before 2032, and then only upon proof that the persona went commercially unused for two consecutive years after 2030. Generative AI platforms capable of imitating long-deceased individuals should take note: as long as the persona continues to be exploited commercially, post-mortem rights may last forever.
Narrower Defenses
The ELVIS Act narrows the fair-use exemptions provided under Tennessee’s current right of publicity statute. Although the act maintains exemptions for “fair use” of an individual’s voice or likeness for purposes of comment, criticism, scholarship, satire, or parody, it adds an explicit condition that an exemption applies only “[t]o the extent such use is protected by the First Amendment to the United States Constitution.” This could increase the burden for defendants claiming fair use: the exemptions carried over from the current law may be inapplicable unless the use is a type of speech that courts have held the First Amendment protects.
The ELVIS Act also narrows possible defenses by attaching liability to those who publish an advertisement or solicitation featuring a cloned voice if the publisher “reasonably should have known” about the unauthorized use. In some cases, this means that generative AI companies that publish advertisements could be liable under the act even without actual knowledge of the unauthorized use.
Remedies and New Claimants
The remedies under the ELVIS Act include injunctive relief to prevent unauthorized use of the individual’s likeness or voice, destruction of materials found to be in violation of the act, and actual damages (as well as enhanced damages in limited cases). Violation of the act can also constitute a misdemeanor offense.
While the remedies are mostly the same as those provided by Tennessee’s current right of publicity law, one change is that record companies will now be able to seek relief under the ELVIS Act on behalf of a recording artist. This could lead to an increased number of right of publicity lawsuits in Tennessee, as the act enables third-party enforcement and no longer puts the burden to file suit squarely on the individual artist.
Setting the Stage for Similar Legislation
Other state and federal laws targeting voice cloning and AI technology platforms are in the works.
For example, California has proposed legislation that would create liability for anyone (including technology platforms) who simulates the voice or likeness of a deceased celebrity via a “digital replica” without consent.[4] Federal lawmakers have also introduced legislation targeting generative AI voice cloning, such as the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, which would create liability when a platform, person, or company digitally replicates an individual’s “image, voice, or visual likeness” without authorization.[5]
Preemptive Considerations for Generative AI Companies
The broad scope of the ELVIS Act will likely lead to an uptick in right of publicity lawsuits with Tennessee as a preferred venue. Technology companies that incorporate AI-generated voice or a celebrity’s persona into their services should consider taking preemptive measures, particularly as other laws similar to the ELVIS Act are enacted.
Generative AI companies should have protocols in place to prevent activity that could fall within the ELVIS Act. If a platform allows third-party users to mimic any element of a person’s likeness, the platform owner should obtain clear and conspicuous representations from users that they have all necessary rights. Generative AI companies should also consider limiting marketing and advertising that might be interpreted as passively supporting the unauthorized recreation of a third party’s persona. And if a company is alerted to, or otherwise develops knowledge of, unauthorized use or replication of an individual’s likeness on its platform, it should act swiftly to address that use.
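As a purely illustrative sketch (not a requirement of the act or a description of any particular product), the Python snippet below shows one way a platform might operationalize two of these protocols: requiring a user rights attestation before fulfilling a voice-cloning request, and blocking further requests for a persona once a takedown notice or unauthorized-use report is received. Every name here (CloneRequest, BLOCKED_PERSONAS, review_request, handle_takedown_notice) is a hypothetical placeholder.

```python
# Hypothetical compliance sketch: gate voice-cloning requests behind a user
# rights attestation and a takedown-driven block list. Illustrative only;
# not drawn from the ELVIS Act or any real platform's implementation.

from dataclasses import dataclass, field
from datetime import datetime, timezone

# Personas flagged after a takedown notice or report of unauthorized use.
BLOCKED_PERSONAS: set[str] = set()


@dataclass
class CloneRequest:
    user_id: str
    target_persona: str   # e.g., the artist or individual the user wants to imitate
    rights_attested: bool  # the user's clear and conspicuous representation of rights
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def review_request(req: CloneRequest) -> tuple[bool, str]:
    """Return (allowed, reason) for a voice-cloning request."""
    if not req.rights_attested:
        return False, "User has not represented that they hold all necessary rights."
    if req.target_persona.lower() in BLOCKED_PERSONAS:
        return False, "Persona is subject to a prior takedown or unauthorized-use report."
    return True, "Request passes the platform's preliminary checks."


def handle_takedown_notice(persona: str) -> None:
    """Act promptly on notice of unauthorized use by blocking further requests."""
    BLOCKED_PERSONAS.add(persona.lower())


if __name__ == "__main__":
    handle_takedown_notice("famous-singer")
    print(review_request(CloneRequest("u1", "famous-singer", rights_attested=True)))
    print(review_request(CloneRequest("u2", "indie-artist", rights_attested=False)))
```

A production system would of course layer in further safeguards (identity verification, logging, human review); the point of the sketch is simply that attestation and prompt takedown handling can be enforced at the request boundary.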
[1] Tenn. Code Ann. § 47-25-1105 (2021).
[2] https://www.capitol.tn.gov/Bills/113/Bill/HB2091.pdf.
[3] https://www.capitol.tn.gov/Bills/113/Bill/HB2091.pdf.
[4] https://fastdemocracy.com/bill-search/ca/2023-2024/bills/CAB00030909/.
[5] https://deadline.com/wp-content/uploads/2023/10/no_fakes_act_draft_text.pdf.