After 45 years of voicing one of the most iconic characters in cinema history, James Earl Jones has said goodbye to Darth Vader. At 91, the legendary actor recently told Disney he was “looking into winding down this particular character.”
That forced the company to confront a difficult question: how do you replace James Earl Jones? The answer Disney eventually settled on, with the actor’s consent, involved an AI program.
If you’ve seen any of the recent Star Wars shows, you’ve heard the work of Respeecher. It’s a Ukrainian startup that uses archival recordings and a “proprietary AI algorithm” to create new dialogue featuring the voices of “performers from long ago.”
In the case of Jones, the company worked with Lucasfilm to recreate his voice as it had sounded when film audiences first heard Darth Vader in 1977.
Jones had reportedly signed off on Disney using recordings of his voice and Respeecher’s software to “keep Vader alive.”
Lucasfilm veteran Matthew Wood said that Jones guided the Sith Lord’s performance in Obi-Wan Kenobi, acting as “a benevolent godfather,” but it was ultimately the AI that gave Vader his voice in many of the scenes.
While there’s something to be said for preserving Vader’s voice, Disney’s decision to use AI to do so is likely to add fuel to disagreements over how such technology should be used in creative fields.
For instance, Getty Images recently banned AI-generated art over copyright concerns. With Jones, there’s the possibility we could hear him voice Vader long after he passes away.