Hollywood special effects teams are increasingly using metaverse-related technologies to bring fantasy worlds to life.
Industrial Light & Magic (ILM), the special effects firm founded by George Lucas in 1975, is among a host of firms pioneering virtual production; its ‘StageCraft’ technology is used to design and create virtual reality (VR) sets. The technology has roots in game design and the creation of virtual worlds.
The technology has been used in hit Disney+ shows such as The Mandalorian and Obi-Wan Kenobi, as well as big-budget feature movies including Matt Reeves’ The Batman, starring Robert Pattinson.
How it works
StageCraft sets work much like virtual reality headsets, rendering imagery in real time. The key difference is scale: instead of rendering to a headset display just a few inches across, a virtual set uses hundreds of connected LED screens to form a C-shaped wall of virtual imagery 20 feet high, 75 feet across, and wrapping 270 degrees around the stage.
Virtual sets are dynamic: the imagery on the wall shifts in perspective as the camera’s point of view moves, making the camera a VR headset and camera in one. The director can point the camera wherever they choose, capturing the real and virtual environment in a single shot.
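The effect depends on recomputing the rendered perspective from the tracked camera position on every frame. ILM’s actual pipeline is proprietary, but the underlying geometry is the well-known “generalized perspective projection” used for tracked displays. The sketch below is a minimal illustration of that technique for a single flat LED panel; the function name and the panel corner coordinates are invented for the example.

```python
import numpy as np

def led_panel_projection(pa, pb, pc, eye, near=0.1, far=1000.0):
    """Off-axis projection for one flat LED panel from a tracked eye point.

    pa, pb, pc: panel corners (lower-left, lower-right, upper-left), metres.
    eye: tracked camera position in the same coordinate space.
    """
    pa, pb, pc, eye = (np.asarray(p, dtype=float) for p in (pa, pb, pc, eye))

    # orthonormal basis of the panel: right, up, normal
    vr = (pb - pa) / np.linalg.norm(pb - pa)
    vu = (pc - pa) / np.linalg.norm(pc - pa)
    vn = np.cross(vr, vu)

    # vectors from the eye to the panel corners
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)  # perpendicular distance from eye to panel plane

    # frustum extents projected onto the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # standard asymmetric (off-axis) perspective frustum
    proj = np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

    # rotate the panel basis into view space, then move the eye to the origin
    rot = np.identity(4)
    rot[:3, :3] = np.vstack([vr, vu, vn])
    trans = np.identity(4)
    trans[:3, 3] = -eye
    return proj @ rot @ trans

# recomputed every frame as the camera tracker reports a new position
mvp = led_panel_projection(pa=(-3, 0, 0), pb=(3, 0, 0), pc=(-3, 4, 0),
                           eye=(0.5, 1.7, 4.0))
```

Recomputing this matrix every frame as the tracker reports a new camera position is what makes a flat wall of LEDs appear to have real depth from the camera’s point of view.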
“I’m able to put actors and cameras in this environment and we can see it and play in it and live in it,” explained Rick Famuyiwa, an executive producer and director on The Mandalorian.
ILM initially powered its virtual sets with the Unreal Engine – an off-the-shelf solution for building games. The Unreal Engine powers Fortnite, Batman: Arkham City (Return to Arkham), Ark: Survival Evolved, and Conan Exiles, as well as virtual reality titles including Robo Recall and The Walking Dead: Saints & Sinners.
ILM now uses a bespoke rendering engine, in keeping with its overall philosophy of developing technologies and processes in-house.
Too much of a good thing?
Hollywood is swiftly becoming a testing ground for metaverse technologies as the industry throws significant money at virtual stages.
Miles Perkins, industry manager of film and TV for Epic Games (Unreal Engine), recently spoke about the rapid proliferation of the technology.
“We are tracking roughly 300 stages, up from only three in 2019,” he told The Hollywood Reporter last month.
Such rapid adoption can create as many problems as it solves.
If industry insiders struggle to learn the language of VR and how to get the best from it, the result could damage public perception over the longer term. Both The Book of Boba Fett and Obi-Wan Kenobi drew criticism for their over-reliance on LED stages. The more recently produced series Andor avoided the same trap by shooting more on location.
Director-producer Jay Holben remains optimistic about virtual sets even if the technology can catch unwary filmmakers off-guard.
“A lot of people are walking in thinking they can just turn on the camera and shoot,” Holben said. “But if the light is not correct and the color balance isn’t set up properly, these things can look bad.”
Pixar also involved
Live-action studios are not the only Hollywood big hitters employing or developing metaverse technologies. Animation giant Pixar is also contributing to the field, albeit in a different arena.
In 2003 the studio developed Universal Scene Description (USD), a platform-agnostic framework that lets animators in different studios, often using different software, communicate and collaborate effectively. It was first used on the hit movie Finding Nemo and remains in use at the studio to this day.
In 2016 Pixar made USD open-source. Now USD is being touted as the HTML of 3D, with applications in visual effects, architecture, design, robotics, CAD and the metaverse.
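USD’s collaboration model rests on composable layers: each department or tool writes to its own file, and a stage composes them non-destructively, with stronger layers overriding weaker ones. Here is a minimal sketch using the open-source Python bindings (the pxr module); the file names and prim paths are invented for illustration.

```python
from pxr import Sdf, Usd, UsdGeom

# Department A authors geometry in its own layer.
geo = Usd.Stage.CreateNew("geometry.usda")
UsdGeom.Xform.Define(geo, "/Set")
ball = UsdGeom.Sphere.Define(geo, "/Set/Ball")
ball.GetRadiusAttr().Set(2.0)
geo.GetRootLayer().Save()

# Department B records an override in a separate layer,
# without ever touching Department A's file.
tweaks = Usd.Stage.CreateNew("tweaks.usda")
over = tweaks.OverridePrim("/Set/Ball")
over.CreateAttribute("radius", Sdf.ValueTypeNames.Double).Set(3.0)
tweaks.GetRootLayer().Save()

# The shot composes both layers: stronger layers win, nothing is destroyed.
shot = Usd.Stage.CreateNew("shot.usda")
shot.GetRootLayer().subLayerPaths = ["tweaks.usda", "geometry.usda"]
ball_prim = shot.GetPrimAtPath("/Set/Ball")
print(UsdGeom.Sphere(ball_prim).GetRadiusAttr().Get())  # -> 3.0
```

Because the override lives in its own layer, the original file is never modified – the same mechanism that lets separate studios, or entirely separate applications, work on one scene.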
Nvidia picks up the torch
Nvidia is one of the companies betting big on USD. It will employ the software within its Omniverse, a metaverse platform geared toward enterprise applications.
“USD should serve as the HTML of the metaverse”
According to Nvidia, “what will make the entire metaverse a success will be the same thing that has made the 2D web so successful: universal interoperability based on open standards and protocols.”
As Nvidia sees it, that interoperability comes from USD. In August it stated outright that “NVIDIA believes that USD should serve as the HTML of the metaverse.”
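The HTML comparison is apt in one concrete way: like HTML, USD’s .usda format is declarative, human-readable text that any conforming tool can parse. The geometry layer from the earlier sketch, for instance, would serialize to something like this:

```
#usda 1.0

def Xform "Set"
{
    def Sphere "Ball"
    {
        double radius = 2
    }
}
```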
To make Nvidia’s Omniverse dream a reality, USD will need to develop and change.
At present, there are a number of gaps in USD that limit its ability to serve as the HTML of the metaverse. The format does not yet fully support international character sets – a significant drawback for any system aiming for global appeal.
Nor is USD yet fast enough to handle high-speed incremental updates – functionality that is key to creating “digital twin” environments, which recreate the real world in the virtual one.
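To make the performance concern concrete, here is a hedged sketch of what a digital twin asks of USD: a steady stream of small edits. The scene file, prim path, and telemetry feed below are all invented; the pattern of one time-sampled write per frame is the load that has to stay fast.

```python
from pxr import Gf, Usd, UsdGeom

def read_sensor_positions():
    """Stand-in for a live telemetry feed from the shop floor (hypothetical)."""
    for frame in range(240):
        yield Gf.Vec3d(frame * 0.05, 0.0, 0.0)  # dummy positions, metres

# Open the (hypothetical) twin scene and pick a machine to mirror.
stage = Usd.Stage.Open("factory_twin.usda")
robot = UsdGeom.Xform(stage.GetPrimAtPath("/Factory/Robot_01"))
move = robot.AddTranslateOp()

# One time-sampled write per frame: each Set() is an incremental edit
# the runtime must absorb, which is where update speed becomes critical.
for frame, position in enumerate(read_sensor_positions()):
    move.Set(position, float(frame))
stage.Save()
```

At the scale of a whole factory – thousands of prims updating many times per second – the per-edit cost of writes like these is exactly the bottleneck Nvidia describes.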
Another issue that Nvidia will need to solve is representing virtual worlds in-browser, since not everyone will have access to VR headsets during the earliest days of the metaverse. Nvidia says it is working with partners to find solutions to these issues.
The future may be bright for USD and Nvidia’s Omniverse if these challenges can be met. There are already signs that other industries will find commercial applications for technology developed in Hollywood.
Already Ericsson, Kroger, and Volvo are employing USD to build 3D worlds, including workspaces and factories. Last year Nvidia collaborated with BMW to build a virtual factory of the future, finding cost and efficiency savings in the process.
From the movie screen to the metaverse and back to the real world, techniques developed for TV shows and movies look set to increasingly become part of our everyday lives, blurring the lines between fantasy and reality.