How Video-Game Engines Help Create Visual Effects on Movie Sets in Real Time

Solo: A Star Wars Story BTS
Courtesy of John Wilson/Lucasfilm Ltd.

Donald Glover was blown away. “This is the coolest thing I have ever done,” he could be heard muttering into a hot mic after he had put the Millennium Falcon into hyperdrive for the first time on the set of “Solo: A Star Wars Story.”

What impressed Glover so much was that the scene wasn’t filmed in front of a green screen, as is typically the case with movies that rely heavily on visual effects. Instead, Lucasfilm’s Industrial Light & Magic unit had built an elaborate setup of five 4K laser projectors around the Falcon’s cockpit, which displayed the iconic hyperdrive animation in real time. The setup not only allowed Glover and his fellow actors to perform in less of a vacuum, but the projectors were also used as the sole source of lighting — resulting in stunning reflections of the flashing blue lights in the actors’ eyes.

The hyperdrive-jump scene is just one example of a new production paradigm that has become a growing part of Lucasfilm’s “Star Wars” movies. Instead of adding visual effects in post-production, the studio is relying more on real-time technologies. And Lucasfilm isn’t alone in the approach: From major movie studios to independent producers, everyone is increasingly embracing real-time production tools that change how movies and TV shows are made — and enable projects that might not otherwise have existed. 


Over the past few years, ILM has been developing a suite of virtual production tools that embraces a range of real-time technologies. Dubbed Stagecraft, these tools encompass the entire production process, from early set design with the help of VR headsets to visual effects like the ones used for “Solo.”

What unites many of the tools is that they instantly deliver results that previously would have taken hours, or even days, explains ILM head and executive creative director Rob Bredow. “Real-time is a fundamental change to the workflow,” he says. “Visual effects and digital techniques are being included much, much earlier in the process.”

One example: Virtual sets that previously would have been added to a film weeks later can now be previewed in real time, as shots are being framed. ILM used the technology for another scene in “Solo” — the train heist sequence that combined actors performing in front of a green screen with footage shot in the Italian Alps. “You can get a sense of how this is actually coming together,” says Bredow. “This is a game changer in terms of the kind of creative choices you can make.”

ILM’s journey to embrace real-time technologies began in earnest with Steven Spielberg’s “A.I.” some 20 years ago, when it helped the director to make Rogue City, the film’s glitzy and sexualized take on Las Vegas, come to life on set. “Obviously, the city was big and not practical to build,” recalls ILM chief creative officer John Knoll. Instead, ILM built a dedicated tracking system for the onstage camera that allowed computers to add a preview of the virtual set in real time. This made it possible for Spielberg to compose shots in front of a blue screen and at the same time preview them with a virtual rendition of the entire city.

To do this, ILM used a game engine — the software at the core of modern, graphics-rich video games that renders imagery on the fly to account for the unpredictable movements of a video-game player. “That was one of the very first times that a game engine had been used for live previz on a set,” Bredow explains of the previsualization process, which allows filmmakers to see what effects-driven scenes in a film will look like before they are shot.

Initially developed as a kind of underlying plumbing for video games, game engines have increasingly become a favorite tool for filmmakers looking to add real-time visuals to parts of their production process.

At first, the use of game engines in Hollywood was mostly limited to the kind of pre-visualization pioneered by Spielberg and ILM. But as graphics-processing chipsets optimized for this type of real-time computing become more powerful, game engines are playing a bigger role across the entire workflow, down to what insiders call the final pixel — images that look so good they can actually be shown in theaters or on TV.


Nickelodeon last year announced a show with the working title “Meet the Voxels” that will be produced entirely with a game engine. Disney Television Animation released a series of shorts called “Baymax Dreams” in September that were produced in a similar fashion. And Lucasfilm sneaked a game-engine-rendered droid, K-2SO, into “Rogue One: A Star Wars Story,” where it was virtually indistinguishable from traditionally rendered characters.

“Everybody’s realizing that the day of epic render farms and waiting 16 hours to see what the water looks like in your shot is over,” says Isabelle Riva, who heads Made With Unity, an arm of the game-engine developer Unity Technologies that promotes the use of its software in Hollywood and beyond. “Wasting your time is over,” she says. 

The immediacy of real-time production tools, and the ability to respond to requests from filmmakers much more quickly, is a big reason why ILM is embracing them. “Visual effects are really all about iteration time,” explains ILM PR director Greg Grusby. “The more iterations you can get in front of a director, the quicker you’re gonna get to the final goal.”

Adds Bredow: “Real-time gives us the ability to put together better approximations of the final shots earlier. And when you can do that, then all the efforts can go into making that shot look as good as possible. Get all the subtle things down that are going to make the character breathe correctly and his clothes look perfect and the lighting be perfect. Once you know that the general idea of the shot is working well, you can work on those finesse pieces.”

Aside from saving time in the production process, real-time also holds the promise of being a lot cheaper than traditional production technologies. This opens up opportunities for a new crop of filmmakers to produce Hollywood-like fare at much lower budgets, especially when it comes to animation. A great example of this is “Sonder,” which was crowned best animated short at last year’s Los Angeles Independent Film Festival.

“Sonder” director Neth Nom had worked on a couple of video games, as well as some virtual reality projects for companies like Google and Baobab Studios. After experiencing the power of game engines at those jobs, he decided to rely on the technology to produce his film. “I saw the potential for it to save a lot of time in production,” he says.

Using a game engine to make a movie — particularly finding the right people for the job — wasn’t always easy. At the beginning of the project, Nom went to a number of Unity meet-ups, which he likens to speed-dating nights for developers, complete with the letdown of ending up without a date at the end of the night. “Once a month I would try to recruit Unity engineers, but nobody was interested,” he recalls. “They were all just hardcore gamers.”

Ultimately, Nom and “Sonder” producer Sara Sampson found a crew of 240 people, who all worked on the short as a labor of love, often collaborating remotely and changing things on the fly while reviewing scenes via Google Hangouts video conferences. “A lot of our crew members are in different countries,” says Nom. “This kind of challenges the idea, ‘Do you really need a studio?’ You could do this in your living room.”

“Great stories can come from anywhere,” agrees Riva. Game engines, she argues, can help independent creators do much of what studios have done for years with expensive tools and huge render farms — high-performance computer clusters specifically built to produce visual effects, generally for film and TV. “You don’t need the whole suite that most movie studios have.”

But big studios aren’t ready to throw out their existing production technologies just yet. ILM, for instance, has been focusing on interoperability among its various tools. It still relies on traditional software but in some instances leans on a customized version of Unreal Engine, the game engine from Epic, developer of the massively popular title “Fortnite.” Additionally, ILM has a proprietary real-time engine dubbed Helios, based on technology developed at Pixar.

“Five or six years ago, when we started this era of making ‘Star Wars’ films, it was very clear that we would have to reuse a lot of assets,” explains Bredow. That’s why ILM developed a way to take the same assets and make them work in any of these toolsets. Quips Grusby, “Build once, deploy anywhere.”

Actor Brendan Byrd performs as a raptor during character testing of the Magic Mirror real-time performance system.
Chris Hawkinson/Industrial Light & Magic

The approach also helped ILM when it was working on “Rogue One.” Knoll recalls looking at some of the work Electronic Arts had been doing with its “Star Wars” games, and realizing that EA already had built some assets for its games that ILM needed for the movie, including the iconic AT-ST Walker, the Imperial vehicle that fans of the franchise first got to see in “The Empire Strikes Back.” “We pulled a couple of assets over, and because the real-time tools are as good as they are now, it didn’t take much for us to be able to use them in a feature film,” he says.

As game engines become more popular, this type of exchange is becoming a two-way street, allowing studios to reuse film assets in video games, VR experiences and more. “The Universals, the Paramounts, the Foxes, the Warner Bros. — they’re not only looking at VFX films,” says Riva. “They’re looking at everything. Consumer products, rides, games, films, everything.”

And as studios embrace game engines to make movies, they gain access to a new generation of artists, explains Bredow. “We are definitely putting together film artists with people who have more games and real-time technology backgrounds,” he says. “We can recruit from a broader pool of people with different experiences.”

Ultimately, the speed at which Hollywood is embracing real-time tech depends on how comfortable filmmakers are working with colleagues coming from the video-game world, and the production tools they bring with them. Some directors may be more married to existing workflows, but others are embracing the new world of real-time wholeheartedly.

Bredow remembers ILM visual effects supervisor Grady Cofer demonstrating one of ILM’s first real-time tools to Spielberg in September 2015: the Stagecraft Magic Mirror, a customized motion-capture tool that allows actors to observe themselves as visual-effects characters in real time on a large LED screen, just as if they were looking into a mirror.

Spielberg was working on “Ready Player One” at the time, and Cofer’s plan that day was to just show the director what the technology was capable of. However, Spielberg immediately grabbed a virtual camera and started framing shots to let “Ready Player One” characters like Art3mis and Parzival come to life on the Magic Mirror’s screen.

“It turned from a technology demo to a creative brainstorming session with him and his actors,” Bredow recalls. “That’s when we get really excited, when the technology kind of disappears and it’s just back to filmmaking. Then we know we’ve hit the mark.”