Artificial intelligence can now clone a voice from a short audio sample, generate photorealistic video of a person who never appeared on camera, and even produce endorsements that an athlete or performer never gave. The technology is advancing faster than the law, and the gap between what AI can do and what the law clearly prohibits is wide enough to create meaningful risk for athletes, entertainers and the professionals who represent them.

Until recently, an athlete’s name, image and likeness (NIL) rights were one of the primary legal tools for protecting personal identity from unauthorized commercial use. Those rights were developed for a world where misappropriation often meant placing someone’s face on a billboard without permission. Generative AI has changed that equation, and most traditional right of publicity frameworks were not built to address it.

What the States Are Doing

Against that backdrop, states have begun to act. Tennessee moved first. Its Ensuring Likeness Voice and Image Security Act (ELVIS Act), which took effect in July 2024, is one of the first laws in the country to expressly address AI-generated voice replication within a right of publicity framework. Critically, the ELVIS Act reaches not only those who create unauthorized deepfakes but may also extend to developers and distributors of the AI tools used to make them. Violations can result in civil liability and criminal penalties.

California followed with two significant laws, both effective Jan. 1, 2025. AB 2602 protects living performers from boilerplate contract provisions allowing studios or brands to use an AI replica of their voice or likeness in place of work they would otherwise have performed in person. Such provisions are unenforceable unless the performer had legal representation and the contract clearly describes how the digital replica will be used. Generalized language may no longer be sufficient without specific disclosure of those uses.

AB 1836 requires explicit estate consent before an AI-generated replica of a deceased athlete or performer can be used commercially. New York, Illinois, Texas and Utah have enacted or are actively pursuing similar protections, and Washington state enacted legislation in March 2026 adding “forged digital likenesses” to its existing personality rights framework, effective June 10, 2026.

The Federal Picture

Congress has not yet passed a national standard. The NO FAKES Act has been introduced with bipartisan support but has not been enacted. Recent enforcement actions, including the first federal conviction involving AI-generated imagery under the TAKE IT DOWN Act, underscore that these risks are no longer theoretical. Until a national standard exists, an athlete’s or entertainer’s protections depend heavily on where they live, where their deals are made and where the offending content is distributed, creating a fragmented landscape where rights may exist in theory but prove difficult to enforce in practice.

What To Do Now

For athletes, entertainers and their representatives, several practical steps are worth considering. Review existing contracts for broad digital rights grants, particularly those signed before recent California laws took effect. Consider trademark registration for names, nicknames, logos or signature phrases, which can provide a separate and effective tool against unauthorized AI-generated commercial use.

In new deal negotiations, treat digital replica provisions as a meaningful point of negotiation, not boilerplate, and insist on specificity around authorized uses, context, duration and compensation. For estates managing the legacy of a deceased athlete or performer, digital likeness rights are increasingly important to address proactively.

Parsons Behle & Latimer regularly advises athletes, entertainers, and their representatives on protecting personal brand assets, negotiating agreements, and navigating emerging issues in intellectual property law. Reach out to learn more at https://parsonsbehle.com/capabilities/sports-entertainment-and-media