An AI program voiced Darth Vader in ‘Obi-Wan Kenobi’ so James Earl Jones could finally retire

A startup called Respeecher recreated the actor's voice as it was in 1977.


After 45 years of voicing one of the most iconic characters in cinema history, James Earl Jones has said goodbye to Darth Vader. At 91, the legendary actor recently told Disney he was “looking into winding down this particular character.” That forced the company to ask itself a difficult question: how do you even replace Jones? The answer Disney eventually settled on, with the actor’s consent, involved an AI program.

If you’ve seen any of the recent Star Wars shows, you’ve heard the work of Respeecher. It’s a Ukrainian startup that uses archival recordings and a “proprietary AI algorithm” to create new dialogue featuring the voices of “performers from long ago.” In the case of Jones, the company worked with Lucasfilm to recreate his voice as it had sounded when film audiences first heard Darth Vader in 1977.

According to Vanity Fair, Jones had signed off on Disney using recordings of his voice and Respeecher’s software to “keep Vader alive.” Lucasfilm veteran Matthew Wood told the outlet that Jones guided the Sith Lord’s performance in Obi-Wan Kenobi, acting as “a benevolent godfather,” but it was ultimately the AI that gave Vader his voice in many of the scenes.

While there’s something to be said for preserving Vader’s voice, Disney’s decision to use an AI to do so is likely to add fuel to ongoing disagreements over how such technology should be used in creative fields. Getty Images, for instance, recently banned AI-generated art over copyright concerns. With Jones, there’s also the possibility we could hear him voice Vader long after he passes away.