The NAB story starts here
Across The Pond / Mark London Williams
Lead image: ICG Panel at NAB 2019 (l-r): ASC president Kees van Oostrum; DeLuxe senior colorist Andrea Chlebak; Ryan McCoy, senior previs/postvis supervisor at Halon; cinematographer Christopher Probst; VFX supe Jim Berney, and ICG’s resident technology specialist, Michael Chambliss. Credit: Greg Gayne
“We went into a motel for a week.” That may be one of the most Vegas-sounding lines you’ll hear at NAB, though the setting for the tale was on the opposite coast, away from Sin City’s noonday heat, and decades in the past.
It was inventor Garrett Brown, holding forth at the Tiffen booth, talking about the history of his Steadicam, and how a seminal piece of filmmaking equipment whose arrival now seems inevitable, in retrospect, had numerous near-misses en route to market.
(read more about Garrett and Steadicam's story - originally in BC75, May 2016 - HERE)
The avuncular Brown, always at ease in front of crowds during each of his live presentations, said that with the prototype, they’d “got in the habit of making impossible shots.” Except those were all in 16mm, and he was asked for 35mm footage as more “proof of concept.”
Which led to being holed up in that motel, reworking the mechanisms, gimbals, etc., which ultimately resulted in the 35mm version that immediately sparked some cinema history in the ‘70s, both with Rocky running up those fabled Philadelphia museum steps while training, and Haskell Wexler’s epic crane-into-Steadicam shot that helped open the Oscar-winning Bound for Glory.
We mention Brown because, as tightly packed as an NAB, or National Association of Broadcasters, schedule can be - meetings, appointments, and events are generally arranged weeks in advance of the show itself - there is still some serendipity involved in simply wandering the aisles, as there was in coming across Brown’s talk while en route to one of those aforementioned appointments.
And if serendipity should play a role in events hosted in a city known for its insistence that your own luck is about to change, so it was in ferreting out a theme to this year’s gathering. NAB was embracing its numerous constituencies - from its original radio/TV broadcasters to vast new groups of vendors and attendees now that “broadcast” means “to stream,” and streaming, basically, means “just about any digital content you can think of” - by simply marketing this year’s motif as “story” (whether you’re making them or consuming them). But it turns out that another of this year’s overarching themes was the letters “GPU.”
By which, yes, we mean a graphics processing unit, as opposed (or in addition) to the “central,” or “CPU” kind more familiar in computers everywhere, and in how render farms, workstations and post houses have been laid out since the ‘90s.
This was the story at Maxon’s annual press luncheon - one of the early kickoff events to NAB. But instead of their usual panel, they had their recently minted CEO David McGavran holding forth on the rendering and 3D modeling company’s recent acquisition of… another rendering company! Namely Redshift, known for its own “Redshift Engine,” which is described as “GPU-accelerated,” and is part of the Grail-like search for “real time” rendering, to better serve live broadcasts, AR-enhanced entertainment, and more (as a side note, there may have been more buzz at the show around the possibilities for AR - “Augmented Reality” - than around its Virtual Reality sibling…)
McGavran was holding forth with Redshift CEO Nicolas Burtnyk, who described his company as mostly comprised of engineers, rather than marketing people. Now with Maxon, they’ll be marketing Redshift’s upcoming “RT” engine, using what McGavran calls “now critical” GPUs, for the “photorealistic” rendering expected to spread to additional production pipelines.
GPUs were also the order of the day up in RED’s demo rooms, above the main convention floor. We’d written previously of their partnership with Nvidia, whose processing alacrity now makes working even with 8k imagery possible in proverbial “real time,” thanks to its use in the new RED R3D SDK and REDCINE X-PRO software offerings. One of the upcoming “grails,” then, is still to incorporate such rapid transcoding into editing software. In the meantime, though, you’ll be able to transcode those info-choked files even on a laptop - as long as you’re outfitted with the right GPU cards.
Of course, one of the ironies of our digital moment is that sometimes images can look too sharp, as far as conveying a certain tone, or emotion - hence the popularity of “rehoused glass” on the lens turrets of all those 6 & 8k boxes.
Canon has taken heed, and was introducing their “Sumire Prime” lenses, which everyone wants to pronounce “Sue-mire,” upon reading it. But it’s actually enunciated “Soo-mee-ray,” from a Japanese word evoking a kind of floral beauty.
Tim Smith, who manages the Pro Market division for Canon, noted that it was a cultural shift for the company to have given the lenses “a name, instead of a number. The concept is that flowers get more beautiful as they open. This is similar to that - as you stop down, the look changes. There’s character built into the lens,” including “lots of interesting things happening in the corners,” like softer skin tones, lens flares, etc. “That’s very untraditional for Canon - thinking about lenses in more of an organic way, and less of an engineering way.”
“When you get above 2.4,” he continues, “they become much more conventional. So for the people who need that kind of look, T4 and up is going to be tack sharp, and high contrast. And they’re also full frame,” Smith adds. “There’s been a real need for modern, full frame glass.”
Glass, and all the other needs of the full frame/digital era were being discussed upstairs right after our stop at the Canon booth, at a panel sponsored by ICG’s Local 600, whose topic was charting where the modern imaging process “begins and ends.” If such parameters can be said to apply to “image capture” anymore.
The roster comprised ASC president Kees van Oostrum; Ryan McCoy, senior previs/postvis supervisor at Halon; VFX supe Jim Berney; DeLuxe senior colorist Andrea Chlebak; and cinematographer Christopher Probst, with ICG’s resident technology specialist, Michael Chambliss, moderating. The panel was part of NAB’s “Birds of a Feather” track, which looks at the various overlaps, compatibilities and occasional conflicts in the workflow between cinematography, visual effects, and postproduction.
Of those workflows, Probst said there was “never a negative effect bringing in the cinematographer early,” especially at the previs stage. On the opposite end, Chlebak found it “strange that major DPs weren’t coming in,” to work on color correcting, though often that can be due to budgeting, and/or a particular DP’s schedule.
As for previs, van Oostrum opined it was possible to get “lost” in it, and that the cinematographer’s task was “no different than it always was - to create the look of the film with the director,” though he later acknowledged that in genre films, the production designer might be the main collaborator for a story’s mise-en-scène.
At which point, the whole live rendering/GPU theme reared up again in an audience question on whether LED technology was getting to the point where you’d finally be able to do “live compositing.”
“I’d be all for it,” Berney said, after he’d recounted numerous tales where there seemed to be as much post-production work to make a film coherent, as there had been during the actual production - if not more.
Probst noted that you’d still have to create the FX in question in advance, for the live compositing, and that the LED screens had to be large enough to cover the set, so that “the pixels wouldn’t show.” Though he’d had some success for Netflix’s Mindhunter series in creating a “plate vehicle van” with a dozen cameras in it, “capturing all the information that was needed,” so that later, the protagonists could “have their driving conversation on stage,” realistically replicating any number of open roads.
The use of real-time data was even one of the main themes at venerable Cooke Optics, known for their history with glass rather than digits. And indeed, variations on that storied glass were on display - with hitherto unseen lenses from their S7/i, Panchro/i Classic and Anamorphic/i Full Frame Plus ranges.
But they were also there with “i Cubed,” or /i 3, which isn’t the title to the sequel of an Isaac Asimov robotics book, but rather, a new version of their own metadata system which sends detailed lens information to VFX and post-production teams, including distortion mapping.
Cooke’s jefe, Les Zellan, called this part of his “metadata quest,” which, Quixote-like, had its doubters even as recently as three years ago: “People didn’t know what metadata was,” he told us. “‘Why are you doing this?’,” they’d keep asking him. “Those days are gone,” he affirms, but even so, only a handful of labs are equipped to make regular use of the digital info. “We want to make it industry-wide,” he said.
And so - they’ve open-sourced it. Or, at least, “given it away to all our competitors. Zeiss uses it,” he notes, “Angénieux, Leica has it, Canon, Fuji…” and then Zellan corrects himself about the “giving away” part. “We charge them one pound a year,” he says, as part of the contractual arrangement. “Really sets ‘em back!” But he notes some companies are “so fastidious about it,” they not only ask for an invoice, but then wire the single pound back to Cooke - racking up charges far in excess of what was actually “owed.”
And even though Zellan would be happy to take the stipend via a friendly glass of beer, the companies are clearly getting far more than their money’s worth: “We’ve taken what used to take two or three hours,” in terms of compiling data, “and compressed it into ten seconds,” he continues. “This should be music to every producer and effects person’s ears.”
And it doubtlessly is - along with the humming of all those GPUs, relentlessly rendering data, in 6 and now 8k formats. But what better place to point to the coming of new technology that will continue to supplant real life, in real time, than Vegas?