Jared Harris's voice is hosting a podcast he never agreed to record. Films Not Made, a new show from producer Amy Hobby, used AI to deepfake his voice for episodes about unmade Hollywood projects. According to Deadline, Harris has instructed his lawyers to intervene—which would be straightforward if this were a studio production covered by the SAG-AFTRA contract. It's not. It's an independent podcast, operating in the vast regulatory gap where last year's hard-won union protections don't apply.
The 2023 SAG-AFTRA strike ended with contract language requiring studios to obtain consent before using AI to replicate a performer's likeness or voice. That was a meaningful victory. It set a precedent. But it only governs work produced under union contracts. Films Not Made exists outside that system entirely—different infrastructure, different revenue model, different legal framework. Harris can call his lawyers, but whether they can actually stop the podcast depends on which state's right-of-publicity law applies, how courts interpret statutes written decades before generative AI existed, and whether anyone involved has deep enough pockets to make litigation worthwhile.
This is where the real fight begins. The SAG-AFTRA deal protected actors working within traditional Hollywood. It didn't—and couldn't—regulate the entire universe of people who now have access to voice-cloning tools accurate enough to fool casual listeners. The podcast economy, the creator economy, the entire sprawl of independent media production: none of it falls under union jurisdiction. Harris's case is the first major test of whether actors' likeness rights can extend beyond the union footprint, or whether the protections won last year only matter inside the studio system.
Right-of-publicity law in the U.S. is a patchwork. California and New York have robust protections. Other states barely recognize the concept. Almost none were written with AI in mind. The question isn't whether Harris has been wronged—he clearly has. The question is whether existing law gives him a remedy, or whether this will require new legislation to close gaps that are becoming more obvious every month. The SAG-AFTRA contract bought time. It didn't solve the problem.
What makes this case particularly instructive is that Films Not Made isn't trying to deceive anyone. It's not a scam robocall or a fake endorsement. It's a creative project whose makers presumably saw a deepfaked voice as a legitimate production choice, maybe even an homage. That's the cultural shift actors are up against: a generation of creators raised on remix culture, who see AI tools as just another plugin, and who may not understand why consent matters when the output is clearly synthetic.
This mirrors the tension playing out across creative industries. Fashion is experimenting with AI-generated models, music producers are cloning vocal styles, and visual artists are watching their work get fed into training datasets without compensation. The common thread: tools that were once expensive and required specialized skill are now accessible to anyone with a laptop and a subscription. The scale and accessibility of that capability is what makes this different from past fights over image rights.
The entertainment industry has dealt with unauthorized use of likeness before—look-alikes, impressionists, manipulated footage. Those cases required human skill and left visible seams. AI collapses that barrier. A podcast producer with a modest budget can now generate a performance that sounds identical to a professional actor's work. The friction that once protected performers—the cost, the expertise, the detectability—is gone.
Harris called this "a perfect example of concern creatives have over misuse of AI." He's right, but the concern runs deeper than misuse. It's the assumption that use without consent is even on the table. The fact that this podcast exists suggests that some creators still see AI-generated performances as fair game, either because they don't know better or because they're betting that enforcement will be too slow, too expensive, or too legally ambiguous to stop them. The technology moved faster than the cultural norms, and the law is still catching up.
The real tension here is between what's technically possible, what's culturally acceptable, and what's actually legal. AI voice synthesis is now trivially easy; cultural norms around its use are still being negotiated; and the law trails both. Actors forced studios to agree to contractual guardrails. Now they need those guardrails to become statutory, and they need them to apply everywhere, not just inside the union bubble.

That's a harder fight. It requires state-by-state legislation, or a federal right-of-publicity statute that doesn't currently exist. It requires persuading lawmakers that this isn't just a celebrity grievance, but a broader question about who controls digital identity at a moment when anyone's voice, face, or likeness can be synthesized without their participation. It requires moving faster than the technology, which is not something legal systems are known for.
If Harris's lawyers can establish that this kind of use is actionable under existing law, it sets a precedent that could give other actors leverage. If they can't, it confirms what many in the industry already suspect: that the SAG-AFTRA contract was a starting point, not a solution. The union won protections for work produced under its jurisdiction. Everything else—the podcasts, the YouTube channels, the independent productions that now make up a significant portion of media output—remains unregulated.
The Harris case will likely be messy, slow, and inconclusive. But it's forcing a conversation that the industry can't avoid anymore. The tools exist. The creators using them aren't going away. And the legal framework that's supposed to protect performers was built for a world where replicating someone's voice required their physical presence. That world is gone. What replaces it will be determined by cases like this one—fought not in the clean confines of union negotiations, but in the chaotic, inconsistent patchwork of state courts and outdated statutes.