Seeing the light

Decisions about light and colour made before the virtual production shoot begins can have a big impact on a production’s success.  

Notwithstanding the efforts of DITs and playback specialists, final finishing is often the first point in the life of a production at which images shot on an LED stage are seen in a proper viewing environment. That’s likely to be the first time the work is subject to a really accurate assessment, and its success often depends on decisions about light and colour made long before the shoot even commenced. 

Final finishing is where senior colourist Siggy Ferstl, who has graded countless commercials, features and TV episodes, begins the grade. Ferstl’s CV includes commercials for Nike, Mercedes Benz and Cadillac, as well as The Boys for Amazon Prime and Wednesday for Netflix. His recent experience with virtual production suggests that the devil is in the details. “It’s a new technology, and there’s always an excitement about new technology. People want to use it and test it and push its limits. And for the best results, you really have to test it and get everyone involved, especially your colourist. It’s only when you’re pushing the contrast around on a high-end monitor in the correct viewing conditions that you will really see some of the issues.” 

With a wide variety of technology involved, Ferstl finds that just as wide a variety of concerns can arise. “It’s a range of issues – noise, aliasing, jagged edges. There can be contrast ratio issues where the blacks just aren’t deep enough, and the peak whites aren’t intense enough. If you’re talking about the blacks in a very moody situation, for instance, the foreground might well have deeper blacks than the LED panels used on set are capable of. Sometimes, in the grade, I’ll generate a depth map in Resolve or use a Magic Mask to separate the foreground and background, to add a touch of contrast to the background imagery, but fixes can become more complicated.”  
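
Resolve’s depth map and Magic Mask are interactive tools, but the idea behind the fix Ferstl describes – using a per-pixel depth estimate as a matte so that only the background takes the extra contrast – can be sketched in a few lines. A minimal illustration in Python with NumPy, assuming a normalised depth map (0 = near, 1 = far) is already available; none of this reflects Resolve’s internals:

```python
import numpy as np

def deepen_background(frame: np.ndarray, depth: np.ndarray,
                      gamma: float = 1.3) -> np.ndarray:
    """Add contrast to distant pixels while leaving the foreground alone.

    frame: float image, values 0..1, shape (H, W, 3)
    depth: float map, shape (H, W), 0 = near camera, 1 = far
    gamma: values > 1 deepen shadows, mimicking blacks the LED
           panels themselves could not reproduce on set
    """
    graded = np.power(frame, gamma)   # contrast curve over the whole frame
    matte = depth[..., None]          # depth as a soft matte, per channel
    # Far pixels take the graded value; near pixels keep the original,
    # with a smooth blend in between.
    return frame * (1.0 - matte) + graded * matte
```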

The results of shooting on a VP stage can be mixed: “There are times when it works well and you put it up, balance it out and think ‘wow, this is great’. You can see the potential in it. There are shots or scenes that just look stunning. But you’re constantly checking what you’re looking at from a technical perspective. There are little things you sometimes don’t see that still need to be addressed.” 

As is so often the case in production, prevention in pre-production is better than cure at almost any later date, so Ferstl always pushes his clients to try their approach up front. “If there’s any way to test – shoot dark scenes, bright scenes, with wide angle lenses, over- and undercranking, different shutter angles – before principal photography, I strongly recommend it. There’s always this stage of learning: where are the limits? How far can we push this? That’s where testing should be a big part of pre-production.” 

In such a young field, it’s easy to overlook the importance of feeding this kind of information back to technical people who are on set and in a position to make best use of it. As Ferstl points out, though, the scramble to meet a release date may not make it easy to schedule those meetings. “[Often] they would, but I’m sure there are shows where a lot of the information that we learn in post doesn’t make it back. Everyone’s moved on, everyone’s just dealing with the issue at hand and might not be communicating as efficiently as we’d like. That sometimes means I’m pulling out a lot of tricks and spending a lot of time having to tweak shots from scenes filmed in front of LED panels.” 

Creative possibilities 

Cinematographic best practices for virtual production, as Ferstl explains, need not be complicated. They often involve the same approaches to the original photography that have been selling effects for decades. “Tighter shots with a longer lens tend to work well,” he points out. “I also think it’s important to keep the camera moving a little, even if it’s subtle, just a dolly or a track or something like that, to get a bit of a perspective shift going. It can really help make an effect look real. Elements that bridge foreground and background, like snow or rain falling in the foreground, can also help integrate the two.” 

Where that sort of thinking works best is often in collaboration with the sort of post-production finesse Ferstl finds himself applying when the illusion doesn’t quite hold up. “Things like torchlight – with really high bright whites,” he says, “often don’t have enough intensity, but in post we can isolate just those portions of the shot and punch them up or apply a soft glow to make them look more realistic.” Another technique that can help sell the illusion is integrated lighting. “Something like torchlight, where you have flickering flames on the LED panel, might not register in the foreground. In a situation like that, I’ve sampled the flicker rate within my colour corrector and used that information to generate my own flicker in the grade, keying in specific areas of the image which would be affected by the flames, so it appears the torches are lighting the whole area. I’ve used those sorts of tricks frequently to help integrate and make scenes look more realistic. There is a lot the colourist can do to help, but it’s always best to figure these things out earlier rather than later.” 
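
The sample-and-reapply trick Ferstl describes maps onto a simple signal chain: measure the brightness of the on-screen flame region frame by frame, then reuse that curve as a gain on a keyed foreground area. A rough sketch of the idea in NumPy – the clip is assumed to be loaded as an array of frames, and the box and matte are stand-ins for the keying a colourist would do interactively:

```python
import numpy as np

def sample_flicker(frames: np.ndarray, flame_box) -> np.ndarray:
    """Measure per-frame brightness of the flame region on the LED wall.

    frames: (T, H, W, 3) float array of the clip
    flame_box: (y0, y1, x0, x1) region containing the on-screen flames
    Returns a gain curve normalised around 1.0.
    """
    y0, y1, x0, x1 = flame_box
    lum = frames[:, y0:y1, x0:x1].mean(axis=(1, 2, 3))
    return lum / lum.mean()

def apply_flicker(frames: np.ndarray, gain: np.ndarray,
                  matte: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Re-light a keyed foreground area with the sampled flicker.

    matte: (H, W) float key, 1.0 where the torchlight should land
    strength: how hard the synthetic flicker hits the keyed area
    """
    g = 1.0 + strength * (gain - 1.0)   # scale the flicker's amplitude
    g = g[:, None, None, None]          # broadcast over height, width, RGB
    m = matte[None, :, :, None]         # broadcast over time and RGB
    return frames * (1.0 - m) + frames * g * m
```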

That sort of interactive lighting has been a powerful technique for effects integration ever since people first found themselves waving lighting devices around by hand in an effort to simulate motion through an environment. Tim Kang is principal engineer for Imaging Applications at Aputure and chair of the Lighting Committee at the ASC’s Motion Imaging Technology Council, where the use of video data to directly control lighting is a large part of his brief. This sort of image-based lighting is, he says, not a new idea, especially in the context of lighting computer-generated worlds to match reality. “Image-based lighting has been around for twenty years, and image-based workflows are available to drive lighting fixtures right now. It’s the crashing of industries together, the confluence of VFX technique and another industry, which is lighting. It’s the combination of virtual and real life.” 
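
At its core, the image-based lighting Kang describes samples the picture every frame and turns it into fixture drive values. A minimal sketch of the sampling half, with the output transport deliberately left as a comment – in a real system, a media server handles the colour management and the sACN or Art-Net delivery, and the bare gamma curve here is only a stand-in:

```python
import numpy as np

def region_to_fixture_rgb(frame: np.ndarray, box) -> tuple[int, int, int]:
    """Average a screen region down to one 8-bit RGB triple for a fixture.

    frame: (H, W, 3) float image, values 0..1
    box: (y0, y1, x0, x1) region the fixture should 'see', e.g. the
         part of the LED wall nearest to it
    """
    y0, y1, x0, x1 = box
    rgb = frame[y0:y1, x0:x1].mean(axis=(0, 1))
    # A bare 2.2 gamma stands in for real colour management, so the
    # fixture roughly tracks what a display would show for this data.
    rgb = np.clip(rgb, 0.0, 1.0) ** (1.0 / 2.2)
    return tuple(int(round(float(c) * 255)) for c in rgb)

# Per frame, this triple would be packed into a DMX universe and sent
# over sACN or Art-Net to the fixture -- the part that products such as
# disguise, Pixera, or Assimilate Live FX handle in practice.
```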

Turning those lofty ideas into practical filmmaking tools, Kang goes on, is the subject of active development. “Even up until the release of all the Quasar stuff, and the release of the Kino-Flo Mimik, it was still a science experiment. You needed Unreal Engine, or a media server like disguise or Pixera, to drive this sort of stuff. You needed to have a very high-level professional to do all this. But tools are going to keep coming out to make it more straightforward and blur the lines between what people already know and what virtual production is really about.” 

In such a rarefied field, task-specific tools are scarce, but Kang cites one as key: “Assimilate Live FX. You’re taking something as simple as a Photoshop type of live interface and control, with Resolve-type colour management and tools, and After Effects level of compositing and live video effects, and a video switcher. You’re able to take that powerful package and drive pictures with it in an interface that doesn’t require a high-level technician with a PhD or master’s level of understanding. Any technician in the film business who has dealt with this stuff can drive it.” 

Standard approach 

If there’s a mountain left to climb in this field, it’s standardisation. Devices currently speak one of several protocols, often derived from legacy technologies built for live events, and there is no guarantee that two lights will emit the same colour even when they are sent the same data. “It’s colour and communication,” Kang confirms. “There needs to be an agreed-upon way to communicate and understand video colour. There are infrastructures that exist, but they’re not fully understood and exploited. They were designed to facilitate flash-and-trash rock and roll. Eye-popping, video-driven pixel mapping has been around for twenty years in the event world, but it never understood the video data correctly to make it a photography technique.” 

As so often, there are any number of potential solutions; the problem is not so much implementing any one of them as securing the agreement of all the interested parties. “Lighting manufacturers need to catch up in understanding video language,” Kang points out. “How do we attach a good spectrum to that colour space? How do I get my light to react like a monitor in terms of responsiveness and create a good result on camera? The ASC Lighting Committee has been working toward that so manufacturers know what to do. And it’s not just for video; it improves everyone’s product lines regardless.” 
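
The colour half of that problem can be made concrete. If a fixture’s emitters don’t share the video signal’s primaries, the incoming RGB has to pass through a per-fixture calibration matrix before it drives the LEDs; otherwise two fixtures fed identical data will disagree. A sketch with an invented matrix – real values would come from measuring the fixture’s primaries with a spectrometer:

```python
import numpy as np

# Hypothetical 3x3 matrix mapping Rec.709 video RGB into one fixture's
# native emitter space. In practice the values come from measuring the
# fixture's primaries and solving for the matrix; these numbers are
# purely illustrative.
REC709_TO_FIXTURE = np.array([
    [0.95, 0.04, 0.01],
    [0.02, 0.96, 0.02],
    [0.00, 0.05, 0.95],
])

def video_rgb_to_emitters(rgb) -> np.ndarray:
    """Convert a video RGB triple (values 0..1) to emitter drive levels."""
    out = REC709_TO_FIXTURE @ np.asarray(rgb, dtype=float)
    return np.clip(out, 0.0, 1.0)   # a fixture cannot emit negative light

# A fixture with a different emitter set needs a different matrix; only
# when each light applies its own calibration will two lights sent the
# same video data agree on colour.
```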

With that in mind, Kang points out that the recent storm of interest in VP has perhaps given the world two things. “It made rear projection better. 90 percent of the chatter has been about honestly better rear projection techniques, it’s just making that proscenium more realistic behind a person… and it introduces image-based lighting. This is a philosophical shift. It’s not a scary thing; it’s exciting, because technology is making lighting more literal and creative and fun. Once you get the technical stuff out of the way it’s opening up so many more things.” 

Words: Phil Rhodes 
