A new way of working

Directors, DPs, and VP supervisors discuss the workflows, techniques, caveats and benefits of LED volume shooting and the wider world of virtual production.

“If you’re shooting in an LED volume, what dictates the lighting?” asks director-DP Brett Danton. “If it’s the background, which has already been done, somebody else is dictating where the lighting is coming from. Yes, you can change it on the day, but it’s not as fast as you think.”

Danton, who has tested and shot with various virtual production technologies for media company WPP, has found that CG artists sometimes lack appropriate lighting skills. “They build the scene beautifully, but their lighting isn’t the way I’d do it in the real world… It’s almost lit like a game.”

Eben Bolter BSC had a similar experience with the virtual backgrounds for nighttime forest scenes in Percival, a short film. He wanted to create a single-source moonlit look but found it difficult to convince the 3D team. “In the computer game world, you can get that floating studio around characters where they’re always perfectly lit,” he comments. “I really didn’t want to do that. I wanted to keep it grounded a bit more.”

Eben Bolter BSC shot Percival on the Capitol stage at Rebellion’s studio in Oxfordshire 

Working during lockdown, Bolter went through an iterative process with his Percival collaborators to get the look he sought. “We would have a meeting and look at what they’d done in 3D on Zoom, and then just give notes,” he explains. “Then a couple of days later they’d come back with a new draft, and we’d give notes again.”

Bolter’s level of input is rare, according to Phil Galler, virtual production supervisor and CTO of NEP Virtual Studios. “The reality is that DPs are not involved,” he says. “More often than not it’s a visual effects supervisor who’s dictating the lighting loads prior to getting to the stage.” He believes this is a hold-over from traditional workflows in which “visual effects has owned the process”, doing their work after a DP has left the project. “I would like a DP involved from the beginning because I think their lensing choices would help with the previz,” he adds.

“The best solution is us as a DP or cinematographer being able to light it,” affirms Danton. He typically receives assets which he can load into NVIDIA Omniverse, a 3D simulator and collaboration tool. “The scenes are pre-built in other packages… and all I’m doing is dropping lights into it.”
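
Omniverse scenes are USD under the hood, so “dropping lights into it” can be done with a few lines of Python using the pxr module. Below is a minimal sketch of the idea, adding a single hard source for a moonlit look; the file path and light values are illustrative, not taken from Danton’s workflow.

from pxr import Usd, UsdGeom, UsdLux, Gf

# Open a scene pre-built in another package (hypothetical file)
stage = Usd.Stage.Open("forest_set.usd")

# Drop in one hard key to mimic single-source moonlight
moon = UsdLux.DistantLight.Define(stage, "/World/Lights/Moon")
moon.CreateIntensityAttr(3.0)                    # dim night level
moon.CreateColorAttr(Gf.Vec3f(0.75, 0.82, 1.0))  # cool blue-white
moon.CreateAngleAttr(0.53)                       # small angular size, hard shadows

# Aim the light by rotating its transform
UsdGeom.Xformable(moon.GetPrim()).AddRotateXYZOp().Set(Gf.Vec3f(-40.0, 25.0, 0.0))

stage.GetRootLayer().Save()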

SEEKING VALIDATION

Once the backgrounds are finished and lit, the VP department conducts a step called validation, in which the virtual assets are brought into the game engine, checked, and tested on the LED walls. “We’ll try to line up the shots, and make sure the angles work,” says Julia Lou, a virtual production supervisor at Lux Machina. “If we need to adjust for a certain shot, a lot of times we’ll move an asset or an object, but it’ll just be for that [one] set-up.” A variant of the virtual scene with that adjusted object can be saved separately and recalled on the day of filming.
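
Lou’s team works in the game engine, but where assets travel as USD the same save-a-variant-per-set-up idea can be sketched with USD variant sets. Prim paths and names here are hypothetical:

from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.Open("forest_set.usd")
log = stage.GetPrimAtPath("/World/Set/FallenLog")

# Record a per-shot variant that nudges one object for a single set-up
vset = log.GetVariantSets().AddVariantSet("shotSetup")
vset.AddVariant("sc12_setupB")
vset.SetVariantSelection("sc12_setupB")
with vset.GetVariantEditContext():
    # Assumes the prim has no existing translate op
    UsdGeom.Xformable(log).AddTranslateOp().Set(Gf.Vec3d(1.5, 0.0, 0.0))

stage.GetRootLayer().Save()

On the day, selecting “sc12_setupB” recalls the adjusted layout for that one set-up; clearing the selection restores the master scene.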

“Colour’s always a very interesting topic in LED volumes!” laughs Lou. “We do a lot of testing with DPs and with colour scientists to figure out the calibration of the LEDs to the camera. We try to keep everything coming out of the LEDs as linear and neutral as possible, so that creative control over how the grade looks is with the camera and DIT, so that we’re not baking it into the content and affecting the lighting.”
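
One common form this calibration takes is a 3×3 matrix built from the camera’s measured response to the panel’s pure red, green, and blue primaries, pre-applied to the content rather than the grade. The numbers below are invented, and a real calibration also has to handle panel non-linearity and metamerism:

import numpy as np

# Camera RGB response to the panel's pure R, G, B patches
# (columns = panel primaries; values are illustrative)
M = np.array([[0.92, 0.06, 0.03],
              [0.04, 0.88, 0.07],
              [0.02, 0.05, 0.95]])

correction = np.linalg.inv(M)  # undoes the panel-to-camera crosstalk

def calibrate(rgb_linear):
    """Pre-correct scene-linear content so the camera reads it neutrally."""
    return np.clip(rgb_linear @ correction.T, 0.0, None)

# Drive values that make the camera see a neutral 0.5 grey
print(calibrate(np.array([0.5, 0.5, 0.5])))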

One BSC member I spoke to felt that he had not been given this creative control, after shooting nighttime driving scenes for a few days in an LED volume. He asked not to be named as the production is still under an NDA. “Upfront I had been assured that I would not have to adjust the camera to suit the wall, but almost straight away that was what I was asked to do,” he told me. He found the slightest light pollution on the screens turned the blacks to milky greys and could not relight or alter his LUT without harming the foreground.

Brett Danton has found that CG artists can lack appropriate lighting skills for some virtual production projects 

Dale Elena McCready BSC NZCS has one solution to this problem: “I generally find that if you flash the lens then you lift the blacks on your foreground subject [to match the background]. Then you can grade them both back down.” Her set-up involves shooting through a diffusion filter – “a little bit of diffusion’s going to help anyway, because it’s going to bleed the background over the foreground” – and shining a small LED source into that filter – “I used a phosphor strip.”
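
The arithmetic behind the trick is straightforward: the flash raises the foreground’s black floor to roughly the wall’s milky level, so a single downward lift in the grade returns both to true black. Back-of-envelope numbers (real flare is never perfectly uniform):

WALL_BLACK = 0.02   # milky grey floor coming off the screens

fg_black = 0.0           # true black on the foreground subject
fg_black += WALL_BLACK   # diffusion bleed plus the flashed filter lift it

def grade_down(value, floor=WALL_BLACK):
    """Pull the shared floor back down to true black in the grade."""
    return max(value - floor, 0.0)

print(grade_down(fg_black), grade_down(WALL_BLACK))  # 0.0 0.0 - both clean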

McCready participated in the Virtual Production Innovation Project, a joint initiative by Sky Studios, Dimension, and DNEG, filmed on ARRI’s mixed reality stage. Collaborating with consultant and DP Robert Payton, she lensed six different scenes, clips from which can be seen on director Paul Franklin’s Vimeo channel. “Very old-school camera techniques to make this high-tech thing believable are really interesting,” says McCready, who embraced the chance to experiment. This included sticking tiny pieces of clingfilm to a filter as localised diffusion, blurring the line between real foreground and CG background. “You are responding to something you see in front of you, and you can tweak the image to make it feel right, whereas with a lot of [conventional] visual effects photography you don’t really know what’s going to happen to that image that you’ve created.”

CHALLENGES REMAIN

But for some, no matter what camera tricks are employed, virtual production is not yet up to scratch. “It’s potentially a pretty exciting technology that will enable some interesting projects to be achieved, but it’s not quite ready,” one ACO member told me, “and it’s certainly not the universal panacea it’s sometimes presented as.”

“I don’t think it’s ready,” Bolter agrees. He lists various problems with LED volumes including limitations on blocking, low screen resolutions which force the DP to a very shallow depth of field, moiré and banding, and latency. The latter Bolter found particularly restrictive on Percival, preventing him from doing handheld shots and requiring the editor to cut out the start and end of any camera moves so that the lag in the background image would not be apparent. He is confident that the problems will be solved, however. “We do need people to keep using it so that it progresses… I’m definitely a long-term fan of it. I’m just cautious at the moment.”

Other DPs, including McCready, have had success with all kinds of camera movements, proving that not all VP systems are created equal. According to Galler, sometimes fast camera moves will need to be rotoscoped in post to correct latency in the backgrounds, but the LED volume approach is still worthwhile because of lighting, reflections, and immersion for the actors. “What works well is what you tested well,” he says, going on to list some of the more dynamic camera moves he has seen succeed without post correction: “I’ve seen drones… I’ve seen some flying-cam work getting finalled, I’ve seen a ton of [Sony VENICE] Rialto work crammed into cockpits getting finalled. Jib work, crane work, dolly work, Steadicam… I think even well-orchestrated fast-moving pieces can work well in a volume if they’re done properly.”

ARRI’s mixed reality stage was the host of the Virtual Production Innovation Project 

“The biggest deficiency in the system that I came across is that it’s not standardised,” McCready says. “We have an HDR element of material coming from Unreal and that should map from zero to whatever [the] bright peak is, onto the screens, so all the values should sit in a certain place. We were constantly having to juggle by eye, going, ‘That should be brighter, so please pump it up a bit more,’ and, ‘Add a bit more contrast.’”
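
What McCready is describing is the absence of a fixed contract between engine values and panel output. As a rough sketch with hypothetical numbers, a standardised mapping might pin scene-linear 1.0 to a reference white and clip at the wall’s measured peak:

PANEL_PEAK_NITS = 1500.0   # measured peak brightness of the LED wall
REFERENCE_NITS = 100.0     # output assigned to scene-linear 1.0

def linear_to_panel(value):
    """Map an engine scene-linear value to absolute panel nits."""
    return min(value * REFERENCE_NITS, PANEL_PEAK_NITS)

print(linear_to_panel(1.0))   # 100.0 - reference white lands predictably
print(linear_to_panel(20.0))  # 1500.0 - highlights clip at the panel peak

With a contract like that in place, “pump it up a bit more” becomes a number rather than an eyeball judgement.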

“The main complaint we get is that people look kind of sunburnt when they’re being lit from the volume,” Lou remarks, “so then DPs or VFX try to compensate by taking magenta out of the content or cooling down the grade of what’s on the LEDs. Some shows have come up with a specific LUT for volume shots, on top of what they use for their regular show LUT.”
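
Conceptually the volume LUT is just a correction composed underneath the regular show look for wall-lit set-ups. A toy sketch, with invented coefficients standing in for real LUT files:

def volume_correction(r, g, b):
    """Pull a touch of magenta out of wall-lit skin (illustrative values)."""
    return r * 0.98, g * 1.02, b * 0.99

def show_lut(r, g, b):
    """Stand-in for the production's regular creative look."""
    return r, g, b  # identity here; the real look lives in a LUT file

def volume_shot_lut(r, g, b):
    # Applied only on volume shots, under the normal show look
    return show_lut(*volume_correction(r, g, b))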

“This is not a silver bullet,” Galler points out. “It’s a tool, and we should be viewing it as a tool… There’s nuance to using a tool properly, and a lot of mastery that’s required to make the most out of it, and also a lot of compromise.”

Brett Danton has also been involved with using photogrammetry and LiDAR scanning to create virtual environments from real ones 

MOVEMENT, PHILOSOPHY, PRACTICE

While LED volume shooting may be the most talked-about type of virtual production, there is far more that falls under the VP umbrella. “Virtual production is a movement and a philosophy and a practice. It’s not just one thing,” says DP Asa Bailey, also a studio virtual production supervisor at Netflix. “It’s the idea of being able to produce content with technology to get over the problems of distance, to create in the Cloud, to create in camera.

“Over this last month I’ve shot in four different countries,” continues Bailey, who did not leave his studio in North Wales. “There’s obviously green implications for what we’re doing, and convenience.”

As part of the Virtual Production Innovation Project, Dale Elena McCready BSC NZCS lensed six different scenes on ARRI’s mixed reality stage

Using a low-latency, high-speed internet connection, Bailey can shoot on stages anywhere in the world, or indeed in the virtual world. “I have a grip set-up; I have a dolly, track; I have jib arms; I have a camera connected to them. All of that motion activity is captured, fed into the system and then replicated either virtually, within the engine, or transmitted to remote hardware… In a way, it’s the performance capture of the camera crew.”
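
Stripped to its essentials, that replication is a stream of time-stamped camera poses plus lens metadata sent to a remote engine. Real rigs use dedicated tracking protocols such as FreeD or Unreal’s Live Link; the UDP packet below is a toy stand-in with a hypothetical address and field layout:

import json
import socket
import time

ENGINE_ADDR = ("render-node.example.com", 5005)  # hypothetical remote engine
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_pose(position, rotation, focal_mm):
    packet = {
        "t": time.time(),   # timestamp, for latency compensation downstream
        "pos": position,    # metres, stage space
        "rot": rotation,    # degrees: pan, tilt, roll
        "focal": focal_mm,  # lens metadata rides along with the motion
    }
    sock.sendto(json.dumps(packet).encode(), ENGINE_ADDR)

# One frame of a dolly move, replicated virtually at the far end
send_pose([1.20, 1.55, -3.40], [12.5, 0.0, 0.0], 35.0)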

A recent project of Danton’s – “basically a concert in the metaverse” – performance-captured an audience instead. Produced using Unreal Engine, the virtual reality film is based around tracks from Bastille’s album “Give Me The Future”. “The fans were streamed in from around the world,” Danton explains, their movement data being rendered as humanoid clusters of light particles. “We shot the band in front of an LED volume wall and the people at home were actually interacting on the wall in real time.” The 2D footage filmed in the volume was later projected onto virtual screens in fantasy 3D environments through which the viewer moves in the final VR experience.

Danton has also been involved with using photogrammetry and LiDAR scanning to create virtual environments from real ones and believes that something similar will happen with filmmaking equipment. “When you buy a physical set of lenses you might have the option of buying the virtual set of lenses as well. And that I know is happening with lighting manufacturers as well, so you’ll buy a physical light and that’ll come with the virtual light.”
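
If a lens does ship with a virtual twin, that twin is essentially a data profile the engine can load alongside the glass. A hypothetical shape for one:

from dataclasses import dataclass

@dataclass
class LensProfile:
    name: str
    focal_mm: float
    t_stop: float
    distortion_k1: float  # radial distortion coefficients, so the
    distortion_k2: float  # CG background bends like the real glass
    vignette: float       # relative falloff at the frame corners

# Invented values standing in for a manufacturer-supplied profile
vintage_35 = LensProfile("Vintage 35", focal_mm=35.0, t_stop=2.1,
                         distortion_k1=-0.08, distortion_k2=0.01,
                         vignette=0.35)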

Dale Elena McCready BSC NZCS was part of the Virtual Production Innovation Project, a joint initiative by Sky Studios, Dimension and DNEG, and shot on ARRI’s mixed reality stage 

It seems that the real and virtual worlds will overlap more and more as time goes on, with new workflows for cinematographers to learn, but many new creative possibilities to embrace as well.

Words by Neil Oseman
