Virtual production treads the boards for SMPTE spectacular
“The meeting of three cultures: theatre, virtual production, and cinema,” is how cinematographer Sarah Thomas Moffat describes her work at ‘The Biggest Virtual Production Show Ever’, a full day of demos, panel talks and networking staged by the Society of Motion Picture and Television Engineers (SMPTE) in Los Angeles.
The event was the brainchild of the organisation’s Christina Nowak, David Grindle, and Ryan Hendricks, PhD, who wanted to highlight SMPTE’s ongoing work on managing media over ST 2110 IP networks and educate the industry about it.
Moffat, or STM as she is known, took on the mammoth task of lighting and capturing recreated scenes from popular TV shows within a live virtual production environment. XR Studios in Hollywood provided the venue and facilities, with Lux Machina and Halon providing the virtual art department (VAD). LAMDA Theatrical School partnered on the creative side to support the blending of technology and art, with everyone involved giving up time and space voluntarily.
With two stages at XR Studios, the virtual productions took place on the larger of the two. Its curved LED wall, measuring 64x33x20ft, is made up of ROE panels offering a resolution of 10560×2112 across 98x20ft of screen space, with the system driven by disguise. The stage also has a one-piece LED floor that meets the wall in a perfect seam – crucial for VP illusions – and can even take the weight of a vehicle, of which more later.
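As a rough back-of-the-envelope check (an illustrative calculation, not a figure supplied by XR Studios or ROE), the quoted resolution and screen dimensions imply a pixel pitch of around 2.8–2.9mm:

```python
# Illustrative estimate of the LED wall's pixel pitch from the quoted
# figures: 10560 x 2112 pixels across 98ft x 20ft of screen space.
FT_TO_MM = 304.8

width_mm = 98 * FT_TO_MM     # ~29,870 mm of screen width along the curve
height_mm = 20 * FT_TO_MM    # ~6,096 mm of screen height

pitch_h = width_mm / 10560   # ~2.83 mm between pixel centres, horizontally
pitch_v = height_mm / 2112   # ~2.89 mm vertically

print(f"Approximate pixel pitch: {pitch_h:.2f} mm x {pitch_v:.2f} mm")
```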
The second stage was reserved for demos and panel discussions, while an invited audience mingled between the two. “While all of this was going on, I came in as the cinematographer to focus on the actual filming of scenes in a virtual environment, to demonstrate what you could do with the technology,” STM explains. “The entire event was also being live streamed, as an audience in the studio watched in real time. No pressure!”
Due diligence
Over a 25-year career, Moffat, an Associate CSC member and nominee, has covered cinematography across all genres, including VP, underwater and extreme environments, and also has theatrical and live production experience. She worked closely with production designer Rick Rifle, who likewise has previous experience working with VP.
“We had three separate scene assets to create: an abandoned gas station at sunset; an alleyway which also doubled for two different scenes; and a futuristic sporting arena,” says STM.
“[We used] Unreal Engine, with Lux Machina and Halon programming asset details and controls. This was critical for our lighting designs of each scene, as you have to light and build the real and virtual worlds at different times, yet they must blend,” she adds. “Aputure lighting was a show partner, bringing in some newer fixtures designed for virtual environments. Gaffer Alex Dumas brought in additional distribution, donated by chief lighting technician Don Mazi Mitchell, while key grip Paul Giacalone donated his company’s ‘Grip This’ truck.
“Location scouting in each engine was mainly done over Zoom meetings, as the SMPTE producing team and LAMDA theatre team were in London, while the VP teams and I were in LA. Halon facilitated the walkthrough of the virtual locations, while LAMDA director Michelle Bonnard, producer Ross McKenzie, and I decided what might work. During our prelight and rehearsal day in the studio, we adjusted the location perspectives to fit the framing, blocking, and set design.”
Lighting the set
STM adapted camera and lighting setups as if the production were theatre. “To me, it is equally important to create a welcoming environment for directors and actors to do their best work as it is to make it look good,” she explains. “The cast was even more exposed on a virtual production stage with a live audience.
“The LED wall and frustum were their backdrop, and the set design on stage provided their props,” she continues. “All players moved from scene to scene in sync, helped in part by the sound design added by LAMDA. I used this to coordinate the rolling in of car headlights, recreated with two Aputure LS 600c Pros with spotlight mounts, on a low-rise rolling stand and T-bar build. For our purposes, it was believable and helped sell the scene on camera, as well as to the audience, while supporting the mood the actors needed.”
To match the colourful virtual tubes added into one of Rick Rifle’s scene assets, STM laid Aputure INFINIBAR fixtures on the floor where it met the wall.
“We had fun with it, as Alex could also control the colour during the scene to add to the drama,” she says. “Off to the side, I had the grips build a 12×12 frame with grid cloth and a honeycomb, with 12×12 blacks draped on both sides to keep the source’s spill off the wall. Inside that were two Aputure Nova P600c fixtures with full spectrum control and an output of 2,298 lux at 3m – an impressive intensity for a small fixture. This giant softbox was my constant key light, and I built in the definition and practicals needed for each scene.”
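To put that 2,298 lux figure in context (a hedged, illustrative inverse-square estimate, not a measurement from the shoot – a large softbox behind grid cloth won’t behave exactly like a point source), doubling the throw distance would cut the level to roughly a quarter:

```python
# Illustrative inverse-square falloff from the quoted Nova P600c figure
# (2,298 lux at 3m). A broad, diffused source falls off more gently than
# this point-source approximation, and grid cloth absorbs some output.
def lux_at(distance_m: float, ref_lux: float = 2298.0, ref_distance_m: float = 3.0) -> float:
    """Estimate illuminance at a new distance for a point-like source."""
    return ref_lux * (ref_distance_m / distance_m) ** 2

for d in (3.0, 4.5, 6.0):
    print(f"{d:.1f} m: ~{lux_at(d):,.0f} lux")  # ~2,298 / ~1,021 / ~575 lux
```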
Capturing the experience
Using RED Komodos fitted with Fujinon zooms, RedSpy camera tracking and pedestals supplied by XR Studios, plus a Varotal 30-95mm T2.9 zoom supplied by Cooke, STM created a different feel for each scene.
“1st AC Nat Armenta did a fantastic job of building up the camera to function as we would normally use it, while also maintaining the tracking system built onto the body,” she says. “Pulling focus on moving actors within a volume takes skill, as Nat proved, not least to avoid any appearance of moiré in the picture.”
STM conducted proceedings by watching the VP frustum on a rolling monitor next to the stage, but with the operations consoles situated at the back of XR Studios, she could also go back and talk to the virtual production supervisor and the operators about the frustum.
“It was decided to only use one frustum tracked to the camera,” she says. “As this was a live event, with four scenes rolling into each other like a theatrical show, it was more feasible for operations to work this way. The interesting side effect was that it created a lot of negative fill, adding a nice soft contrast and contour to each scene. DIT Daniel Woiwode donated his cart and time so I could dial in my look from the camera; that was then fed to the live stream and onto monitors around the studio for the audience.”
The first scene, a four-minute performance, required the use of a dolly push-in.
“XR Studios had not yet seen the use of a dolly on the LED stage floor – this was an exciting first!” she says. “JL Fisher in LA donated a Fisher Dolly and track, dolly grip Spencer Schunke put the soft wheels on and pushed across the LED floor, while SOC camera operator Jessica Lopez guided the lens. It was perfect!”
The production also used handheld operating. “Coming off the dolly onto the shoulder was a smooth changeover,” she says. “The camera was tethered, and we had crew spotting Jessica as she moved. The final sporting scene was done on the in-house jib with SOC jib op Babak Mansouri, who had a lot of fun shifting the entire base while swinging the arm to create loads of energy.
“It was theatre and story driving this experience to a live audience, but blending this with cinema and then the virtual production aspect was a meeting of three cultures,” adds STM. “The language and the personalities and the logic of each culture are different, so we had to communicate [between] those three worlds to bring all of this to fruition. And it worked. It was really exciting.”
–
Words: Michael Burns