Shipshape solution
Inside the innovative Near Real-Time (NRT) workflow on Italian war film Comandante, with IMAGO Technical Committee co-chair David Stump ASC BVK, the project’s workflow designer, and his visual collaborators.
Ferran Paredes Rubio, cinematographer;
When I was first approached by Edoardo De Angelis about this project I was, to be honest, a bit scared, but at the same time I was thrilled, because it’s a great story that takes place on a 1940s submarine in the middle of the Atlantic Ocean. We had to find a way to make it possible.
At the beginning many ideas were put on the table, such as LED walls or projection screens, but the interaction with water made them extremely complicated and very expensive. Pierpaolo Verga, the producer, introduced me to Kevin Tod Haug and David Stump ASC BVK, who had been developing a new technique that would allow us to shoot without LED walls or green screens. Everything started to make sense from then on.
I am very lucky to have been Edoardo’s cinematographer since his first movie. By now we have worked together on six movies and two TV series, so our relationship is quite strong. He talked to me about Comandante almost two years before principal photography. As we always do, we began a slow research process: thousands of pictures from the Second World War, movies and paintings, but above all plenty of documentaries from the Italian Navy that gave us a lot of input on aesthetics, as well as showing us what life on a submarine was like during the war.
The ARRI Alexa Mini LF has been my first choice since it became available. The sensor has incredible colour rendition, and the sensational images achievable by exploiting mixed light were of paramount importance for me inside the submarine. The full frame sensor and small form factor were critical in giving the right look in tight spaces, and also gave us the resolution needed for VFX and post production in general.
Interactive light is fundamental in a film like this. Great effort was invested in creating a relationship between the visual effects planned for each scene and the lighting effects on set, so that the light falling on the actors and on objects such as the submarine felt real. That was extremely important to all of us. All of the ‘previz’ for the VFX scenes helped us enormously, and you can clearly see the results in those scenes!
The film is set inside and on top of a submarine, with about 40 minutes of screen time on the deck, so I felt it would be interesting to use a real panoramic 2.39:1 aspect ratio in order to feel ‘little’ in the middle of the ocean. The other half of the film happens inside the submarine, a very small cylinder packed with people and weapons, and there I was looking for the three-dimensionality that only anamorphic lenses give. The angle of view was also very important in this choice: I knew I would be very close to the actors, and I wanted a wider angle of view with a 50mm or 75mm feel to it.
I think there is only one set of lenses that does the right job for each project, so I try to choose them very carefully. We did various tests and discovered how special the Cooke Anamorphic/i FF x1.8 were in colour rendition, skin tones and overall texture, organic focus roll-off, the elliptical bokeh and flares… somehow everything came together to achieve the look of those portraits from the Second World War that were our main visual reference.
The Cooke Anamorphic/i FF x1.8 were just perfect, not only for their full frame coverage but also for the Cooke /i Technology lens metadata, which was crucial for our VFX department and our Near Real Time (NRT) workflow.
Kevin Tod Haug, VFX designer;
The NRT workflow is based on three common-sense but quite uncommon assumptions:
1. A creative, lossless asset pipeline that preserves all the metadata available from previs to delivery (sketched after this list).
2. Near real-time visualisation/preview of these assets on set, during production.
3. This “set vis” can be used not only by the director, cinematographer and camera operator on set, but also by editorial as “Work In Progress” dailies.
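As a rough illustration of the first assumption, here is a minimal Python sketch of a lossless per-shot record that travels with the media from previs to delivery. The record structure and every field name are assumptions for illustration, not the production schema.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical per-shot record for a "creative lossless" pipeline: the data
# written in previs is read back unchanged, never re-entered, at every later
# stage (set vis, slap comps, final composite, delivery).
@dataclass
class ShotRecord:
    shot_id: str
    usd_asset: str          # path to the USD scene used in previs
    lens_model: str         # e.g. "Cooke Anamorphic/i FF x1.8 50mm"
    focal_length_mm: float
    notes: str = ""

def write_sidecar(record: ShotRecord, path: str) -> None:
    """Serialise the record as a JSON sidecar next to the media."""
    with open(path, "w") as f:
        json.dump(asdict(record), f, indent=2)

def read_sidecar(path: str) -> ShotRecord:
    """Recover the record, unchanged, at the next pipeline stage."""
    with open(path) as f:
        return ShotRecord(**json.load(f))

rec = ShotRecord("COM_042", "assets/submarine.usda",
                 "Cooke Anamorphic/i FF x1.8 50mm", 50.0)
write_sidecar(rec, "COM_042.json")
assert read_sidecar("COM_042.json") == rec  # nothing lost between stages
```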
Preproduction began with previs supervisor Habib Zargarpour. We had to be sure the digital assets we were creating could continue to be used losslessly through prep, production and into post. The submarine, fighters and Atlantic Ocean were created as accurately as possible, on various platforms and from different sources, but always with the intention of using the USD file format to pass them to wherever they were needed, all the way through to final delivery.
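To show what that interchange looks like in practice, here is a minimal sketch using the open-source USD Python bindings (the usd-core package). The asset names and dimensions are illustrative only.

```python
# Authoring an asset as USD so any downstream package (previs tools,
# Unreal Engine, Nuke) can open the same description of the scene.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("submarine.usda")
root = UsdGeom.Xform.Define(stage, "/Submarine")

# A placeholder hull; in production the geometry would come from the
# modelling package, but the container format stays the same.
hull = UsdGeom.Cylinder.Define(stage, "/Submarine/Hull")
hull.GetRadiusAttr().Set(4.0)    # metres, illustrative
hull.GetHeightAttr().Set(76.0)   # roughly the length of a 1940s submarine

stage.SetDefaultPrim(root.GetPrim())
stage.GetRootLayer().Save()

# Any later stage reopens the identical description, losslessly.
reopened = Usd.Stage.Open("submarine.usda")
print(reopened.GetDefaultPrim().GetPath())  # -> /Submarine
```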
In production these previs assets were converted to Unreal Engine assets so they could be composited with the live camera output using Qtake, in real time, for the director. Simultaneously they were passed to Nuke in order to generate “slap comps” showing a better version of a shot while still on set.
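A “slap comp” is essentially a quick A-over-B merge of the CG render and the live-action plate. The sketch below shows how such a comp could be scripted inside Nuke’s built-in Python interpreter; the file paths and frame range are hypothetical.

```python
import nuke  # available inside Nuke's own Python interpreter

# Read the live-action plate and the Unreal Engine render.
plate = nuke.nodes.Read(file="plates/COM_042.%04d.exr", first=1001, last=1120)
cg = nuke.nodes.Read(file="unreal/COM_042_cg.%04d.exr", first=1001, last=1120)

# Merge the CG over the plate: input 0 is B (background), input 1 is A.
over = nuke.nodes.Merge2(operation="over")
over.setInput(0, plate)
over.setInput(1, cg)

# Write the result out as work-in-progress dailies for editorial.
out = nuke.nodes.Write(file="dailies/COM_042_slap.%04d.exr")
out.setInput(0, over)
nuke.execute(out, 1001, 1120)
```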
Using anamorphic lenses has always been a very difficult proposition for VFX work, but on this project we proved that no longer needs to be the case. Live real-time tracking of anamorphic lens distortion was so basic to our workflow, on set and in post, that it quickly lost its magic and became “normal”. But I remember the first time I saw the potential, in one of the earliest tests: I got chills. And I remember the first time I saw it working live, in real time, calling late at night and waking workflow designer David Stump ASC BVK up in LA to show him a grid test from my phone… to show him the future.
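The article does not publish the actual distortion model, so the following is only a simplified illustration of the undistort/redistort round trip: a basic per-axis radial model with the 1.8x anamorphic squeeze, inverted numerically. The coefficients are made up; the production system derived its parameters from Cooke /i lens data.

```python
import numpy as np

SQUEEZE = 1.8           # anamorphic squeeze factor of the lens series
KX, KY = -0.05, -0.02   # hypothetical radial coefficients, one per axis

def distort(pts: np.ndarray) -> np.ndarray:
    """Map undistorted normalised coords to distorted ones, as the lens would."""
    x, y = pts[:, 0], pts[:, 1]
    r2 = x**2 + (y * SQUEEZE) ** 2  # anamorphic: y is optically squeezed
    return np.stack([x * (1 + KX * r2), y * (1 + KY * r2)], axis=1)

def undistort(pts: np.ndarray, iters: int = 8) -> np.ndarray:
    """Invert the model by fixed-point iteration (no closed form exists)."""
    guess = pts.copy()
    for _ in range(iters):
        guess = guess + (pts - distort(guess))
    return guess

# Round trip on a couple of grid points: redistorting the undistorted
# coordinates recovers the originals, which is what lets CG rendered
# without distortion sit correctly inside the anamorphic plate.
grid = np.array([[0.5, 0.3], [-0.8, 0.6]])
assert np.allclose(distort(undistort(grid)), grid, atol=1e-6)
```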
Accurate real-time tracking of the camera was made possible through a collaboration between the geniuses at Cooke and oARo, developed specifically for this project. Cooke’s excellent embedded lens data was captured in real time by oARo and aggregated with the metadata from the ARRI camera, the Technocrane, etc. to create one dataset that could be read immediately by Unreal Engine and stored in Nuke to drive all subsequent versions of the composite.
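oARo’s actual interface is not public, so the sketch below only illustrates the idea of the aggregation step: three per-frame metadata streams merged, keyed by timecode, into the single dataset that both the real-time engine and the compositor consume. Every field name here is an assumption.

```python
from typing import Dict, Iterable

def aggregate(lens: Dict[str, dict],
              camera: Dict[str, dict],
              crane: Dict[str, dict],
              timecodes: Iterable[str]) -> Dict[str, dict]:
    """Merge three metadata streams frame by frame into one flat record."""
    merged = {}
    for tc in timecodes:
        merged[tc] = {**lens.get(tc, {}),
                      **camera.get(tc, {}),
                      **crane.get(tc, {})}
    return merged

# One illustrative frame: Cooke /i lens data, ARRI camera data and
# Technocrane position, unified under a single timecode key.
dataset = aggregate(
    lens={"01:00:00:01": {"focal_mm": 50.0, "focus_m": 2.4, "iris": 2.8}},
    camera={"01:00:00:01": {"iso": 800, "shutter_deg": 180.0}},
    crane={"01:00:00:01": {"pos_xyz": (1.2, 0.4, 3.1), "pan_deg": 12.5}},
    timecodes=["01:00:00:01"],
)
print(dataset["01:00:00:01"]["focal_mm"])  # one dataset, read by any stage
```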
Because we could not use blue, green or LED screens, we worked closely with Foundry in the development of CopyCat, an AI “rotoscoping tool”, to speed up the pulling of mattes for temp comps and dailies.
Valentin Huber, Das Alte Lager VFX;
Until Comandante, we had AI/ML tools available, but they were not yet integrated into our pipeline.
With Comandante, we had the time to dive deeper into the use of Foundry’s CopyCat. In the beginning, our plan was to use it mainly for rotoscoping. After some tests, we got a better understanding of how CopyCat works and we were able to get roughly 90% accurate masks.
CopyCat proved especially helpful in the post-production phase, where there was no picture lock. Once CopyCat was trained on a shot, it automatically generated additional masks for new frames whenever the edit changed. This enabled the rapid creation, usually in a day, of advanced temp composites for editing.
After exploring the possibilities of CopyCat, we decided to use it for more complex retouching as well. This is where CopyCat particularly impressed us. With just a few frames, we were able to train it to handle more complex retouching that consisted of several compositing steps.
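Training is normally driven from the CopyCat node’s interface in NukeX; once a model has been trained, it is applied to new frames through an Inference node that loads the resulting .cat file. The sketch below shows how that application step could be scripted; the knob name and file paths are assumptions.

```python
import nuke  # runs inside NukeX's Python interpreter

# Read the new frames (for instance after an edit change).
plate = nuke.nodes.Read(file="plates/COM_113.%04d.exr", first=1001, last=1090)

# Load the trained CopyCat model into an Inference node.
inference = nuke.createNode("Inference")
inference["modelFile"].setValue("train/roto_v003.cat")  # knob name assumed
inference.setInput(0, plate)

# Render the regenerated masks for the updated cut.
out = nuke.nodes.Write(file="mattes/COM_113_matte.%04d.exr")
out.setInput(0, inference)
nuke.execute(out, 1001, 1090)
```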
David Stump ASC BVK, workflow designer;
It can be hard to recognise breakthroughs as they are happening, but there were numerous breakthroughs in VFX during the making of Comandante. Learning to undistort and redistort anamorphic lenses very accurately in real time was groundbreaking. Presenting meaningful Near Real Time composite shots to the director on set was another. Preserving metadata through the entire workflow was a breakthrough. Using AI tools to enable the technique we have dubbed “similar screen compositing” is another. One breakthrough that cannot be emphasised enough is the positive collaboration of the camera crew, the VFX department and post production to make a quality film on a budget.