Technological trailblazer
The Mandalorian, the project that pioneered virtual production, gets a spin-off in which the technology is equally pivotal.
With the arrival of Ahsoka, one of the most influential characters from the animated Star Wars universe gets her own live-action series on Disney+. Older, wiser, yet retaining the recklessness of her youth, Ahsoka Tano is portrayed by Rosario Dawson. David Filoni serves as showrunner, as he did with Star Wars: The Clone Wars and Star Wars Rebels, which charted her journey from apprentice to Anakin Skywalker through her decision to abandon her training and the Jedi Order.
Initiated into the ways of virtual production and the Force were cinematographers Eric Steelberg ASC (Ghostbusters: Afterlife) and Quyen Tran ASC (Pali Road), with lighting guidance for the eight episodes provided by gaffer Jeff Webster, who had worked on The Mandalorian and The Book of Boba Fett.
“Any shot on any given day was more complicated than that opening shot in The Front Runner,” notes Steelberg. “I walked into this having seen the volume stage only once, without ever having stepped on it.”
David Klein ASC recommended him for Ahsoka and he was upfront during the job interview. “I told them, ‘I’m eager to learn and love exploring new technology as tools to tell stories in new ways,’” Steelberg recalls.
Upon starting prep, he was able to spend time watching the team shoot the third season of The Mandalorian in the same studio. He was also able to speak to Klein, Webster and key grip Walter ‘Bud’ Scott about how they lit the sets and about the limitations of the volume.
“It was just as difficult on season one of The Mandalorian as it was on Ahsoka,” remarks Webster. “The sets are always unique and there are always unique challenges with, ‘How are we going to sell this virtual content as a real physical set?’ The hardest part is dealing with the physical restrictions of the space that you’re given, because it’s the same size for everything you put in there.”
Virtual production accounted for 30 percent of the show. “The lighting was challenging because the light of the background content largely drives what you can and can’t do in the volume,” states Steelberg. “The matching is time-consuming. It requires pre-lighting and going back and forth with the visual effects team that runs the volume to make it all work.”
Every scene is previsualised and extensive VR scouting is conducted. “I can get in there and potentially look at the virtual content as it’s built in the engine itself,” reveals Webster. “Then we can overlay the physical volume wireframe into the virtual content so I can physically go look and see if I could rig something. VR scouting is definitely one of the major improvements that I’ve seen.”
Some aspects were as expected. “I anticipated certain things looking better,” remarks Steelberg. “I enjoyed the fact that you can create an ideal lighting situation that could be kept for as long as you wanted.”
Soft light works best. “It’s hard to get light outside of the volume because you can’t take down walls and put them back up,” states Steelberg. “Getting hard light far enough away outside of the volume to cover the entire working area is difficult, so soft light becomes a much more successful blend.”
There was not much in the way of tungsten lighting. “For output we go HMI, because the white balance of the video wall is around daylight, so we’ll usually use daylight fixtures if we’re going to be using big lights and need more output for the sun,” remarks Webster. “But normally we use LEDs because they give us better colour control than a classic tungsten or HMI fixture.”
The camera package and lenses were inherited from The Mandalorian, although the filmmakers did look at other options. They ended up staying with what they used: ARRI Alexa LF cameras and Caldwell Chameleon Anamorphics as the main lenses, supplemented with a wide-angle Super 35 Zeiss Master Anamorphic, one or two Atlas Orion wide-angle lenses and some longer telephoto Cooke Anamorphic lenses.
Steelberg favoured 45mm, 75mm and 80mm lenses, with a close second being 100mm. “I find with anamorphic that I usually use three focal lengths, pulling out a 135mm for close-ups, with the wide being a 40mm or 45mm.”
Mostly a single camera was used. Steelberg notes that it can be tricky to shoot multiple cameras unless they’re close to each other, capturing the same thing wide and tight, or have overlapping frustums.
All the spaceship hangars and the Coruscant shipyard control room were captured using a virtual production methodology. “Those were easier to light because the lighting in the hangars already feels messier and ambient, so you can get away with a lot more,” observes Steelberg. “In that case it’s about putting up some large soft sources to add to the actors or action so it feels blended into that background. When Sabine Wren [Natasha Liu Bordizzo] goes into the Noti camp and is reunited with Ezra Bridger [Eman Esfandi], it was challenging as a day exterior because that was a soft light with a definite direction from one place, and a lot of different colours and textures going on. Or when Sabine is ambushed and fights a group of bandits while riding her howler; that was tricky because of the bandits’ orange outfits, and the blues and purples of her hair and costume.”
While the LED screens are good at displaying video at a certain brightness, colour accuracy is not as reliable. “When we pre-light the scenes that we’re shooting in the volume, a lot of the time we’re colour balancing the screens with the camera and stand-ins,” explains Steelberg. “The volume will typically put out a magenta light, which on a character with orange skin can be problematic. The colour that the camera sees is different from the colour of the light that the screen puts out, because the LED diodes emit different colours at different angles, so it gets complex.”
An important atmospheric element was added. “What we did do a lot of was wind, and that was a big thing for David Filoni for all of the exterior scenes, because it’s an element that makes you feel like you’re outside.”
Working inside the volume was a worthwhile experience for Steelberg. “It’s a slower process, but you can get incredible results.”
Lessons learned can be applied elsewhere, Webster notes. “I’m hoping to get a little bit more accustomed to working in Unreal Engine. Maybe there’s a chance that we’ll start implementing Unreal into the way I pre-light a non-volume set. With digital twins of Creamsource or ARRI lights, I can start lighting a set with a SkyPanel softbox and see what that looks like. It will make me better at my job.”
–
Words: Trevor Hogg