
Virtual production



A NEW REALITY

LED volumes are an exciting new tool in the cinematographer’s toolbox, mixing the real with the unreal. Four cinematographers who have already embraced the technology share their experiences. 

The term “virtual production” encompasses practices like pre-visualisation and remote location recces, but its most exciting branch is filming itself: the ability to shoot not on location or a physical set, nor against a blue or green screen, but against huge LED screens displaying any background the filmmakers desire. At its simplest, these screens may constitute a single wall. At its most advanced, a huge arc of screens, plus another acting as a ceiling, forms a space known as a “volume”.

Content for the screens can be straightforward 2D footage or all-encompassing material from a 360-degree camera rig. The most groundbreaking set-ups use a video-game engine, typically Unreal Engine from Epic Games, to render pre-built 3D backgrounds known as “loads” or “levels”. By tracking the motion of the camera within the volume, the system adjusts the parallax in the virtual backgrounds to give a convincing illusion of depth.
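That parallax adjustment is worth unpacking, and a few lines of code can sketch the principle. The snippet below is a minimal illustration only, not how a production system is built (the real pipelines run inside Unreal Engine, through tools such as nDisplay); it simply re-projects virtual points onto the wall plane from the tracked camera position, which is the core of the trick.

```python
# Minimal sketch of camera-tracked re-projection onto an LED wall.
# Illustrative only -- real volumes do this inside a game engine.
import numpy as np

WALL_Z = 0.0  # the wall lies in the plane z = 0 (studio coordinates, metres)

def project_to_wall(camera_pos, virtual_point):
    """Intersect the ray from the camera through a virtual point (placed
    'behind' the wall, z < 0) with the wall plane; return wall (x, y)."""
    cam = np.asarray(camera_pos, dtype=float)
    pt = np.asarray(virtual_point, dtype=float)
    t = (WALL_Z - cam[2]) / (pt[2] - cam[2])  # parametric distance to the plane
    return (cam + t * (pt - cam))[:2]

# A rock 2m behind the wall and a mountain 200m behind it:
near, far = (0.0, 1.5, -2.0), (0.0, 30.0, -200.0)

for cam in [(0.0, 1.7, 4.0), (1.0, 1.7, 4.0)]:  # track the camera 1m sideways
    print(cam, project_to_wall(cam, near), project_to_wall(cam, far))
```

Move the camera one metre and the mountain’s rendered position slides almost a full metre across the panels (x: 0 to 0.98), so from the lens it holds its bearing exactly as a real distant peak would; the nearby rock shifts only a third as far (x: 0 to 0.33), so it appears to sweep past. That differential shift is the parallax that sells the depth.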

The ARRI / Creative Technology volume at ARRI Rental in Uxbridge

This technology has been famously showcased by the Disney+ series The Mandalorian, but for cinematographer and co-producer Greig Fraser ASC ACS, the journey began on an earlier Star Wars project. While shooting the 2016 film Rogue One, Fraser found LED walls to be the best way to produce the interactive light of a planet outside a spaceship window. “That started the conversation,” he recalls.

Three years later, large chunks of The Mandalorian were filmed on a 75 x 20ft cylindrical volume, with Fraser involved from the get-go. “The volume is and should be designed by the production designer and the cinematographer,” he asserts. “The shape and the size of it we designed.”

LED walls are infinitely preferable to green screen in most scenarios for car work

Gavin Finney BSC

On this side of the pond, director and cinematographer Robert Payton has been consulting for ARRI and their partners Creative Technology as they develop the UK’s largest permanent volume. The keystone is a curved 30 x 5m surface comprising four 4K screens driven by an 8K processor. An adjustable 10 x 10m ceiling screen, an 18 x 4.2m rear screen and smaller, mobile side panels complete what Payton calls the “mixed reality” stage. “There’s been this misnomer of ‘virtual production’, because everything we’re gearing up to is mixed reality. It’s integrating foregrounds with backgrounds. It’s integrating our traditional craft as cinematographers with what’s come from a gaming environment.”

Many early adopters of the technology are using it for driving scenes. “LED walls are infinitely preferable to green screen in most scenarios for car work,” states Gavin Finney BSC, who has been testing and shooting for the Apple TV+ series Suspicion and C4/NBCU’s Darkness Rising. “You have true reflections in car bodywork rather than green spill and costly compositing. The interactive lighting is impossible to replicate on green screen with such accuracy, and the actors can see where they are going and react to their surroundings, steer around corners, brake… The list goes on.”

Actress Nange Agro starring in Tokyo Nights, shot in ARRI’s new mixed reality studio

For the Amazon series Chloe, shot by Catherine Goldschmidt, the choice to go virtual was fuelled by logistics. “Our short shooting schedule didn’t allow enough time to go out and shoot [the driving scenes] practically,” she says, noting also the unsuitability of the chosen roads for a low loader. “On a single day in the studio we were able to shoot ten different scenes in different locations at different times of day with different vehicles.”

Treehouse Digital in Bristol provided Chloe’s virtual production services, employing a 300-degree volume of their own design. Backgrounds were shot by Jeff Brown of Brownian Motion, under the supervision of plates unit DP James Rhodes. “We used a nine-way configuration consisting of eight Red Helium S35s on the horizontal and a Monstro for the top sky plate because we used a full frame 220-degree fisheye,” says Brown.

The Mandalorian’s loads, although incorporating photographic elements, were primarily computer-generated. This meant that even location scouts were virtual, as Fraser recalls: “You sit there with your goggles and you bring out a [virtual] lens and you say to the director, ‘What about a 50mm here?’ And they stand next to you virtually and they say, ‘Great idea. What about a bit lower?’ And you go, ‘Yep, great.’ Then you work out what they need to build in 3D and what the art department needs to build in the real world.”

Pedro Pascal is the Mandalorian and Nick Nolte is Kuiil in The Mandalorian, exclusively on Disney+

Next comes virtual lighting, the broad strokes of which Fraser was able to do himself with the click of a mouse. “It’s the closest thing to playing God that a DP can ever do. You can move the sun wherever you want. You can go, ‘Let’s turn the sky so the sun actually rises between that window and this window.’” But he is quick to point out that “you’re not just lighting the 3D set; you’re lighting a 3D set that then lights a real set.”

“If you have large enough screens, almost all the ambient lighting is supplied by them in a very realistic way,” says Finney, “especially if you have a ceiling-mounted screen. I tested a 20m curved screen with a 6m ceiling and was able to reproduce a large slice of Trafalgar Square that three cameras could shoot on, and it looked completely real. You add additional lighting for faces, as you would on location.”

Payton agrees. “For me, when I walk onto the volume it’s a bit like going onto a location that’s lit by natural light. I almost go back to thinking like a documentary filmmaker: ‘Okay, how do I supplement this?’ One trick that we’re using is removing some panels that are out of vision and poking a hard light through there. As soon as you introduce some hard light it increases the realism.”

Gina Carano as Cara Dune in The Mandalorian

The Mandalorian’s harder light came from a bank of 50 Digital Sputnik DS1s. “It was broad, it was singular and it had a lighting quality that matched the world that we were in,” says Fraser, who also employed Creamsource fixtures on the volume. “I found that most other lights didn’t match [the screens] and there was a difference to my eye about the quality of light.”

For all its possibilities, virtual production technology is not without its limitations. Because of the vast processing power required to render a 3D load, only the section of the screens seen by the camera is displayed at full resolution; this area is known as the “frustum”. “There’s a bit of a delay, so if you pan too quickly, you start to see the edges of the frustum,” notes Fraser, who found the classical camera style of the Star Wars franchise well suited to this restriction.
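The arithmetic behind that restriction is easy to sketch. The figures below are illustrative assumptions, not measurements from The Mandalorian’s stage: the point is simply that tracking and rendering lag by a few frames, so a pan can outrun the padded, full-resolution region before it catches up.

```python
# Back-of-envelope: how fast a pan can be before the camera outruns the
# high-resolution inner frustum. All numbers are illustrative assumptions.

def max_pan_speed(latency_frames, fps, padding_deg):
    """Fastest pan (degrees/second) before the frustum edge enters frame,
    given total tracking-plus-render latency and the padding rendered
    around the lens's field of view."""
    return padding_deg / (latency_frames / fps)

# Suppose the pipeline lags four frames at 24fps and the inner frustum is
# padded 10 degrees beyond the lens's field of view on each side:
print(max_pan_speed(latency_frames=4, fps=24, padding_deg=10))  # 60.0 deg/s
```

Anything brisker than about 60 degrees per second, under those assumed numbers, and the lower-resolution outer region pans into view, which is exactly why a measured, classical camera style suits the technology.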

With her pre-rendered driving plates, Goldschmidt reports successfully filming handheld on Chloe, adding, “We shot various shutter speeds and some 50fps work as well and we encountered no issues.” She attributes this to a genlock system that ensured cameras and screens were synchronised to record and display their frames at exactly the same time.
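The rule that genlock enforces can be expressed as a quick check. The numbers below are assumptions for illustration, not Chloe’s actual settings: the idea is that every exposure should span a whole number of panel refresh cycles, otherwise partial scans read on camera as banding or flicker.

```python
# Illustrative timing check for camera/LED-panel sync. The refresh and
# frame rates below are assumptions, not any production's real settings.

def exposes_cleanly(panel_hz, fps, shutter_deg=180.0):
    """True if an exposure at this frame rate and shutter angle covers an
    integer number of panel refresh periods."""
    exposure_s = (shutter_deg / 360.0) / fps   # e.g. 180deg @ 25fps = 1/50s
    cycles = exposure_s * panel_hz             # refresh cycles per exposure
    return abs(cycles - round(cycles)) < 1e-9

print(exposes_cleanly(panel_hz=3600, fps=25))  # True -- 72 whole cycles
print(exposes_cleanly(panel_hz=3600, fps=50))  # True -- 36 whole cycles
print(exposes_cleanly(panel_hz=60, fps=25))    # False -- 1.2 cycles: banding
```

A high, genlocked refresh rate that divides cleanly by every intended frame rate is what lets a production mix 50fps and varied shutter speeds without artefacts.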

Shooting test films on the ARRI / Creative Technology volume

“You have to be careful not to put the focus too near the screen, or you’ll get a moiré effect or see the individual LEDs,” Finney advises. “Keeping the screen 3m away from the subject and using a shallow stop of around T2 worked well. However, in one test I was able to focus just short of the screen as a focus pull to the background, and we got away with it.”
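A rough geometric-optics calculation shows why that combination of distance and stop works. The figures below are illustrative assumptions rather than Finney’s test data: moiré fades once the lens’s defocus blur at the screen smears each LED emitter across several of its neighbours.

```python
# Rough defocus-blur estimate at the LED screen. Focal length, stop,
# distances and pixel pitch are all illustrative assumptions.

def blur_at_screen_mm(focal_mm, t_stop, subject_m, screen_m):
    """Approximate defocus blur diameter at the screen plane (mm) for a
    lens focused on the subject; valid when distances >> focal length."""
    aperture_mm = focal_mm / t_stop  # entrance pupil diameter
    return aperture_mm * (screen_m - subject_m) / subject_m

PIXEL_PITCH_MM = 2.8  # a plausible pitch for film-grade volume panels

blur = blur_at_screen_mm(focal_mm=50, t_stop=2.0, subject_m=3.0, screen_m=6.0)
print(blur, blur / PIXEL_PITCH_MM)  # ~25mm -- roughly 9 pixels smeared together
```

Under those assumptions each point on the screen is blurred across nearly nine LED pixels, comfortably hiding the grid; stop down, or pull focus towards the screen, and the blur collapses toward a single pixel, which is when moiré appears.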

“The only time I saw some moiré pattern was in the reflection of the overhead screen in the windshield,” recalls Goldschmidt. “We were able to deal with this by placing some diffusion over the screen itself.”

Goldschmidt’s camera was the Sony Venice, while both Payton and Fraser favoured the ARRI Alexa LF. “Shooting in those volumes lends itself very much to large format sensors,” observes Payton, “just because you’ve got the fall-off in terms of depth of field.” He experimented with various diffusion filters, including 1/4 and 1/8 Hollywood Black Magics, finding they helped to bind foreground and background together.

Reflecting on the future of mixed reality, Payton says: “It’s not going to be a fad. It’s here to stay. It increases your creativity as a cinematographer, but it also increases control, and a controlled environment is a cost-effective environment because you don’t have any nasty surprises.”

“I love shooting on location,” says Goldschmidt, “and the happy accidents that can occur when shooting with natural light and weather. However, I do think virtual production can allow productions an extra level of scale and scope that they otherwise wouldn’t be able to afford.”

Greig Fraser ASC ACS

Fraser refers to a common problem of location shooting: “Okay, there’s a cloud coming and we can’t shoot for the next 15 minutes, therefore all the emotion we just built up over the last hour of the actors’ time we now have to put on hold because the light won’t match.” With virtual production this need not happen; the perfect light can be bottled and displayed on a volume all day. Fraser sums it up: “This technology allows filmmakers to stop time.”
