Despite being a relatively recent addition to the cinematographer’s toolkit, virtual production technology has already earned a legion of devotees thanks to its expansive filmmaking possibilities.
Virtual production (VP) combines virtual, augmented and physical reality with CGI, made possible by real-time game-engine technology. Exemplified by Hollywood productions such as The Mandalorian, the methodology takes traditional screen media production into a new dimension to which cinematography must also adapt. DP James Medcraft calls it “designing theatre for the lens.”
“You’ve got to think of shooting VP as theatre,” he says. “Just as cinematographers in the era of German expressionism were shooting physical objects against a painted background, the approach is to trick the eye and make everything work exclusively from the camera’s perspective.”
However, rather than just filming what is on stage, the DP’s role now expands to something akin to show designer, including advising on physical set design.
“It’s about disguising or bridging the gap between physical and unreal objects,” says Medcraft. “To do that successfully, you need to be involved from the very early stages of production, from prop design and the actual layout of the LED stage all the way through to VFX and working with the Unreal designers.”
Experienced VP cinematographers like Medcraft stress that almost anything can be shot in an LED volume, provided the flexibility to do so is built into the design. This requires the traditional camera unit, led by the DP, to team up with games-engine artists and technicians to plan, test and iterate shots so that everything runs smoothly on the day.
OLD AND NEW
Linking virtual with physical production design in a VP set-up is typically the job of the VP supervisor, a new role described by Asa Bailey as “an enabler”.
Bailey, a cinematographer recently appointed as Netflix VP Supervisor, explains: “The technical creative interpreter should offer solutions to DPs that they are not aware of, and guide and assist with everything from the preparation of the wall and assets – be that 2D plate capture, stitched scenes or real-time fully virtual environments.”
Approaching virtual production with an open mind will go a long way to avoiding common problems such as misaligned horizons or plates that are captured too wide for the lens selection to match.
“A lot of this can be ironed out by testing on the wall itself,” says Bailey. “Ideally the one you are going to shoot on.”
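To make that testing concrete, the arithmetic behind a plate-versus-lens mismatch is simple enough to sketch. The short Python snippet below is purely illustrative: the sensor, lens and plate figures are assumed example values, not numbers from any real production, but it shows the kind of pre-shoot check that compares a 2D plate’s angular coverage with the shooting lens’s field of view and estimates how many plate pixels will actually land in frame.

```python
# Illustrative sketch only: does a 2D background plate hold up for the chosen
# shooting lens? All camera, lens and plate figures below are assumptions.

import math

def horizontal_fov_deg(sensor_width_mm: float, focal_mm: float) -> float:
    """Horizontal field of view of a rectilinear lens/sensor pairing."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def effective_plate_pixels(plate_width_px: int, plate_fov_deg: float, shoot_fov_deg: float) -> float:
    """Rough count of plate pixels that land inside the shooting frame."""
    frac = math.tan(math.radians(shoot_fov_deg) / 2) / math.tan(math.radians(plate_fov_deg) / 2)
    return plate_width_px * frac

# Assumed plate: 8K-wide capture on a 36mm-wide sensor with a 16mm lens.
plate_fov = horizontal_fov_deg(sensor_width_mm=36.0, focal_mm=16)
# Assumed shooting setup: a Super 35-sized sensor with a 35mm lens.
shoot_fov = horizontal_fov_deg(sensor_width_mm=24.9, focal_mm=35)

usable = effective_plate_pixels(plate_width_px=7680, plate_fov_deg=plate_fov, shoot_fov_deg=shoot_fov)
print(f"plate FOV {plate_fov:.0f} deg, shooting FOV {shoot_fov:.0f} deg, "
      f"~{usable:.0f} plate pixels inside frame")
# If that usable pixel count drops below the delivery resolution, the plate was
# captured too wide for the lens selection and will read soft on the wall.
```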
All of this presupposes that a volume is the best solution for the scene or project in the first place.
The reasons it might be range from health-related travel restrictions to the economics of location versus studio work. The vagaries of location work can be avoided, since a VP stage delivers predictability and scheduling security.
Making decisions about what, if any, aspect of a production is best achieved on location and what is best shot in a volume happens as normal during the shot breakdown.
In selecting a volume, the aim “is to recreate the authenticity of a location,” says Johnny Johnson, senior immersive technician at the National Film & Television School’s StoryFutures Academy (SFA). “How well you do that is down to a lot of parameters, including the skillset of the technicians and VFX company.”
A big part of this will be blending the physical with the virtual in-camera, for which matching lighting in the game engine with the practical set is key.
“You always need physical lighting. It’s a myth that you do not,” says Johnson. “It involves a two-way relationship between the gaffer and the Unreal content department.”
CHOOSING THE RIGHT KIT
The shots that work well in a standard volume are mid-range 35mm wides. The bigger the stage, the wider you can frame, but the cost goes up. On a 10x5m wall, for example, “with good pixel pitch and tight technical control you can shoot a hell of a lot, but you will struggle when it comes to the wides,” says Bailey. “Sometimes that’s not possible.
“So, when a production goes on location to shoot wides or scenes, an additional process is applied to capture the asset from that location so it can be brought into the studio. Then, not only can repeatable scenes over the course of a series perhaps be shot within the volume, but the wide scenes may be done with compositing or green screen and more traditional approaches.
“You can fast pan but you’d need tight tracking (and therefore a tight tracking crew). You can shoot handheld, provided you plan for it. Set up a different camera angle? No problem, just design it in (so that the set is wider to allow for it).”
It’s true that you can’t focus sharply on the wall (at least until LED pixel pitch narrows to something imperceptible to the camera), but why would you want to, asks Bailey. “That’s not what you are using VP for. Your subjects are always in front of the wall. You can set the focus for infinity if you want and forget the sharpness of the wall, but if your focus crosses the wall you potentially get moiré. An accomplished VPS and DP can shoot around these things.”
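Bailey’s caution about focus and moiré is, at root, a sampling problem, and it can be illustrated with back-of-the-envelope numbers. The sketch below is hypothetical: the pixel pitch, lens, sensor and distance values are assumptions chosen for the example, not specs from any stage. It estimates how large one LED pixel appears on the camera sensor and compares that with the size of a photosite; when the two are of a similar order and the wall sits near the focal plane, aliasing and moiré become a real risk.

```python
# Illustrative only: rough check of whether an LED wall's pixel grid is resolved
# by the camera at a given distance. The pitch, lens and sensor numbers are
# assumed example values, not specs quoted in the article.

def projected_led_pitch_mm(pitch_mm: float, focal_mm: float, wall_distance_mm: float) -> float:
    """Size of one LED pixel as imaged on the sensor (thin-lens approximation)."""
    magnification = focal_mm / (wall_distance_mm - focal_mm)
    return pitch_mm * magnification

def photosite_mm(sensor_width_mm: float, horizontal_photosites: int) -> float:
    """Width of a single sensor photosite."""
    return sensor_width_mm / horizontal_photosites

# Assumed setup: 2.6mm pixel-pitch wall, 35mm lens, ~25mm-wide 4K sensor, camera 4m from the wall.
led_on_sensor = projected_led_pitch_mm(pitch_mm=2.6, focal_mm=35, wall_distance_mm=4000)
site = photosite_mm(sensor_width_mm=24.9, horizontal_photosites=4096)

ratio = led_on_sensor / site
if ratio > 2:
    verdict = "grid resolvable - keep the wall well behind the focal plane"
elif ratio > 0.5:
    verdict = "near the sensor's sampling limit - moire risk if focus crosses the wall"
else:
    verdict = "grid effectively invisible to the camera"

print(f"LED pixel on sensor: {led_on_sensor * 1000:.1f} um, "
      f"photosite: {site * 1000:.1f} um ({verdict})")
```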
Colour is critical, and LED panels will display colour differently, so creating a standardised LUT for the wall is advisable. What is less common is having a single source of truth for the colour science on set that tallies between wall, games engine and camera. A virtual production supervisor or similar senior technician-cum-creative can lead on this.
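What such a single source of truth might look like can be sketched very simply. The example below is hypothetical rather than any established on-set standard: one small, versioned colour spec, covering the working space, the wall’s calibration target and a fingerprint of the shared LUT file, which the LED processor operator, the Unreal content team and the DIT all read from, so nobody is left guessing which LUT version is loaded where.

```python
# A minimal sketch of a "single source of truth" for on-set colour. The file
# names and field values are hypothetical examples, not a real show's pipeline.

import hashlib
import json
from pathlib import Path

COLOUR_SPEC = {
    "working_space": "ACEScg",                   # assumed working space for engine content
    "wall_calibration": "Rec.2020 / PQ",         # assumed LED processor calibration target
    "camera_monitor_lut": "show_wall_v03.cube",  # hypothetical shared LUT file
    "white_point": "D65",
}

def lut_fingerprint(lut_path: str) -> str:
    """Hash the LUT file so wall, engine and camera teams can confirm they hold the same version."""
    path = Path(lut_path)
    if not path.exists():
        return "missing"
    return hashlib.sha256(path.read_bytes()).hexdigest()[:12]

def write_spec(out_path: str) -> None:
    """Write the spec plus the LUT hash to one JSON file that every department references."""
    spec = dict(COLOUR_SPEC, lut_sha256=lut_fingerprint(COLOUR_SPEC["camera_monitor_lut"]))
    Path(out_path).write_text(json.dumps(spec, indent=2))

if __name__ == "__main__":
    write_spec("colour_spec_v03.json")
```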
Of course, virtual production encompasses far more than principal photography in a volume. It can mean scouting for locations and camera positions using real plates and/or computer rendering in a VR environment. It could mean motion capture and green screen.
AGILE MINDSET
For a DP like Bailey, whose background is in advertising where projects are typically built from agency storyboards and shot concisely, VP is a natural evolution. Other filmmakers with different styles or experience may find the transition more difficult.
“For some filmmakers who like to get a camera and just shoot, the concept of being so prescriptive is daunting,” Bailey says. “VP does take planning and a great deal of preproduction and visualising beforehand. It’s the way of working that is one of the biggest challenges.”
Another challenge is having to adopt a more collaborative approach than the rigid hierarchy of traditional filming.
“Filmmaking is equal in many regards, but there is a potential clash with the software designers of the virtual world who inhabit more of an agile methodology,” Johnson observes.
The workflow is not linear. It’s all about being able to jump between tasks and having the responsiveness to manage the competing pressures. The VP supervisor, for example, is a new ‘above the line’ position and part of the creative leadership.
“They sit with the production designer, VFX supervisor and DP, not in competition with them,” Bailey says. “But there can’t be some utopian democracy. Production, virtual or otherwise, has to work like a machine. The fallacy of VP is that every aspect is interchangeable, an environment where you can do whatever you want. You can – in pre-pro. But when it comes to photography, the economics dictate that there has to be hierarchy on set. Decisions need to be made and if you don’t make them ahead of time, you will fall into the trap of delays on a shooting day.”
SPREADING THE WORD
This new way of working is perhaps the biggest educational gap, and a swathe of initiatives has sprung up to address it.
“The lack of filmmakers who understand software and games engines is where we see a huge shortage,” says Johnson.
The six-month programme at SFA is set up for students to learn theoretical creative practice and then experience it for real on a VP stage. The course fees are 75% underwritten by WarnerMedia.
Elsewhere, MARS Volume, the UK’s largest independent virtual production facility, has teamed up with ScreenSkills and Hillingdon Council to launch a range of vocational VP training programmes in September 2022. The courses will offer students both virtual production theory and hands-on practice, with training from MARS Volume’s in-house team and industry partners such as Lux Machina and Epic Games.
True innovation and new forms of screen entertainment only emerge from the convergence of the film, gaming, VFX and computing worlds.
“We’re looking for the aesthetic that the film folks bring to the table, and we’re looking to the interactive [real-time domain experts] for the interaction design,” says Vicki Dobbs-Beck, president, Immersive Content Innovation, ILMxLAB at LucasFilm. “The magic happens when they both bring the best of their past but are willing to learn from one another.”
–
Words by Adrian Pennington