THE REAL AND THE UNREAL
“In my experience, most DPs get VFX. They’re often a technical bunch who love cameras and gear so they get what we do,” says Geoff D.E. Scott.
Scott has supervised his share of visual effects, from celluloid-shot movies, like Richard Donner’s underrated thriller 16 Blocks and Robert Eggers’ career-launching The Witch, to TV and streaming series like Orphan Black and, currently, Snowpiercer, where digits are wrangled all along the workflow — to paraphrase Bob Dylan.
Within that workflow, the distinctions between “production” and “post” (or even between “previs” and the rest) become increasingly hazy, in an era when shows can be as much “rendered” as they are “shot”. Much the same goes for the oft-eroding distinction between what a DP captures in camera and what a VFX supervisor adds, extends, or finishes later on.
For Scott, it also means working “very closely” with cinematographers like Orphan Black’s Aaron Morton, on aspects like “designing the clone shots and the choreography of the camera and actors (and) often bouncing composition ideas off of each other.”
And if anything, those kinds of working relationships have become even more intertwined in the time since.
“With Snowpiercer it’s completely different. Working with (cinematographers) John Grillo and Thomas Burstyn there’s a lot more crossover than just a few key scenes. We always storyboard the key exterior sequences together. In the room will be the director, John or Thomas (depending on which block), myself and our storyboard artist Marcus Endean. We plan out the big exterior shots together, discussing lighting design and how it will impact the interior shots. Then when on set, we walk through what’s planned with storyboards and animatics to help inform us and keep us on track.”
And staying on track is something that DPs and VFX supes can do for each other, as Ross Emery ACS (main image), observes. He’s been doing a lot of effects-heavy work for Sir Ridley Scott lately, not only on a series of Alien: Covenant shorts, but as one of the cinematographers on HBO’s Raised by Wolves.
“Since visual effects has moved into the world,” he notes, filmmaking has found “a much more fluid way of doing things. It’s becoming a symbiotic relationship with VFX supervisors — they have to fit in more with production schedules, with the dailies schedule; they’ve been moulded into this department that works alongside the production crew.”
Working alongside those crews is something Emery says he’s “always enjoyed (but) it’s become more necessary in recent times. In a sense,” he says of his VFX colleagues, “they’re becoming a bit more like what cinematography was.”
And of course, what it used to be was entirely distinct from the production phase, back when they were known as “special” effects. In the era of a tabletop creature master like Ray Harryhausen, for example, the film would be shot (usually somewhere around the Mediterranean, in his case!), and then the effects, whether a cyclops, statue come to life, fleet of skeletons, or anything else, would be added in after, when everyone else was either home or on to the next project.
But Emery returns again to the idea of the VFX department “coming into that world” of production. “They have to make decisions on the shot, on the day,” which also means they have more input with the DP and director: “We need this, we need motion control, we need a blue screen…!” But the flipside, as Emery notes, is that “I get to contribute more for the visual effects department.”
And of course, those departments aren’t simply making lightsabers swoosh, or wizard brooms fly, anymore. The Visual Effects Society, in its annual awards show, has two culminating categories, recently redubbed ‘Outstanding Visual Effects in a Photoreal Feature’ and ‘Supporting Visual Effects in a Photoreal Feature.’
Recent winners of the former category include what we would normally think of as “effects-driven films,” like The Lion King and Avengers: Infinity War. But the latter category covers more “regular” dramas, comedies, and the like, which can also be heavily “rendered” alongside whatever is filmed — especially in period pieces. Winners in that category have included The Irishman and Clint Eastwood’s Changeling.
As for The Lion King, Johnny Renzulli, the founder of VFX house Chicken Bone, considers the film, along with The Jungle Book, one of the harbingers of when “we started to move into the era of virtual production — blending the real with the virtual in a more seamless way. Those two features are a really good example.”
That new and growing seamlessness “puts tools in the hands of virtual artists, then hands them back to the cinematographer.” And it’s not just a matter of blending what used to be “production” and “post”, either. Even the phase of filmmaking known as “preproduction” has come along for the ride; Renzulli says this same virtual toolset takes previs to the next level and “creates more of a high quality 360-degree environment,” in which one can use “VR headsets for scouting, (and) walk around,” just as you would on a real location.
But aside from the kudos he reckons are owed to pioneers like Jon Favreau, and post houses like ILM, he also agrees that “Covid is accelerating some of these efforts.” For DPs connected through fibre-optics, Renzulli says “we can actually give them a physical camera (or its simulacrum) with a lens and an eyepiece, and they can be looking at a virtual world as if it’s really there.”
He actually sees a kind of “backward loop” unfolding: “We’ve given them back their original tools.” In other words, DPs can look through viewfinders, and plan shots. “Give the cinematographer back their camera, and let them do what they’re really great at.”
“There is,” he continues, “a lot of fluidity, and real time feedback that wasn’t there before. Creative decisions that always need to be made will naturally now include the visual effects supervisor in those conversations.”
Some of Renzulli’s most recent conversations in that regard had to do with the new Netflix show The Queen’s Gambit. Adapted from the Walter Tevis novel, and starring Anya Taylor-Joy, it tells of the coming of age of an orphaned female chess prodigy in the 1960s.
Chicken Bone was one of the vendors, and worked on the show’s chess pieces, which seemingly come to life, and which Taylor-Joy’s character Beth finds herself consumed by. They not only rigged, animated and integrated all the bishops, rooks, et al., but also worked on 3D set extensions, mattes, and other period-specific aspects.
Renzulli calls it “largely a traditional visual effects show,” mentioning that DP Steven Meizler, who’d done a lot of first assistant camera work on Spielberg productions like A.I., Minority Report, and War of the Worlds, and VFX supervisor John Mangia, who was just coming off shows like The Code and The Society, “did an incredible job of building a team of collaborators. (They) immediately hit it off (and) conceptually approached how they would approach each scene.”
As for Chicken Bone, they frequently found that they’d “move from Meizler’s practical environment, and then move into Mangia’s digital (one).”
Often connecting them both were the LED panels they worked with — Renzulli mentions not only green screens, but LED backdrops, and notes that more “retro” effects often won’t blend with them, at least not in a live shot: “It’s not recommended to use practical effects or atmospherics in concert with LED panels,” he says. “It interrupts the visuals quite a bit (often causing) a lot of filtering of light that would normally be built into the scene.” He notes that now there’s really no reason for such a trade-off, if the desired effect is “something that we easily achieve in post with no overhead.”
LED panels — used to great effect by DPs Greig Fraser and Barry “Baz” Idoine in their Emmy-winning work on The Mandalorian, where digital “volumes,” or virtual sets, can render real-time effects while shooting — are also likely, in Geoff Scott’s estimation, to help usher in “the next big leap in redefining how cinematographers and VFX supervisors work together.” He mentions specifically “LED wall virtual productions,” and adds “I’ll be super excited to see how this process will be accepted and used in the next few years.”
But Emery notes there are lots of ways to keep refining how DPs and VFX supervisors work together right now, starting with the files coming out of the camera.
He mentions how much of the talk about changing camera technology keeps focusing on resolution: 4, 6 and 8K, and “Blackmagic just released a 12K camera (but) resolution is just a mathematical computation, of how many pixels you have. It doesn’t discuss colour representation, luminescence (or overall) image fidelity,” all of which can have more to do with the choice of lens, and with the way the film (or “film”) is processed or transcoded.
“That,” he says, “is where I’d like to see the conversation go.” The lens selection, he emphasises, is “one of the chief creative choices a DP makes,” and he mentions the extensive use of “rehabbed glass” in “a lot of films shot on lenses made in the 1970s.”
While this may provide grain, soften some sharp edges, allow for interesting flares, and provide an overall more “filmic” feel, he also notes that sometimes a kind of Owen Roizman/Haskell Wexler-era look doesn’t have the digital clarity that the FX house often needs. “If you shoot the VFX shot on (those) lenses, sometimes — you don’t get a great composite,” or great reference footage to use for those sequences. “All the focus is kind of off on the edges.”
So, Emery says, “I’ll shoot it as clean as possible,” but then he’ll let his FX supervisor know that he wants them “to apply all the aberrations of a 1970s lens” after the sequence is finished, often having his assistants lens map an “old crappy lens” to let the effects side know the look he’s after.
“Visual effects can do both of those,” he says, using their “crisper” files to do their work, yet still making it look like “the rest of the film,” shot on the DP’s preferred glass.
Outside of rehabbed or crisp glass and pixel counts, another thing Renzulli likes about modern cameras is the metadata: “As much metadata as you can throw out, we’ll take it, all day long.”
As for where these collaborations can become closer still in the future, he says he “would love for DPs to familiarise themselves at least conceptually with the Unreal Engine,” and with other digital tools like Notch and Unity, usually the “motors” behind those LED-arrayed “volumes” and walls, as they have been for many immersive gaming environments.
While outfits like “ILM continue to make digital tools easier to use,” he says, understanding “the difference in how an engine calculates the real world and how the real world really is,” may help answer some of those perennial production questions in advance, like “‘how come it’s not doing this?’ ‘We expected it to do this.’” And, Renzulli notes, those conversations shouldn’t only be restricted to cinematographers and visual effects supervisors: “We want to make that same effort for all the different departments in the workflow. There is,” he emphasises, “a lot of room for creativity in this new ecosystem.”