Animated multiverse feature Rift created using DaVinci Resolve Studio
Jun 16, 2022
Blackmagic Design announced that production company Hazimation has used DaVinci Resolve Studio for editing, colour grading, visual effects and audio post production on the animated feature film Rift.
The project started during the pandemic and was developed by a distributed team using Unreal Engine and DaVinci Resolve Studio to manage all aspects of production. Set for release later this year, Rift will be previewed with a trailer at this year’s Annecy International Animation Film Festival.
Built interactively, Rift relied on DaVinci Resolve Studio as a planning and timeline tool instead of conventional storyboards typically associated with animation. “The process has been incredibly flexible,” begins Director and Producer Hasraf (HaZ) Dulull. “Our production team is distributed and entirely based in the cloud, working to an altogether new, unconventional approach.”
Shots were created and rendered out of Unreal Engine as 4K EXR final pixel renders, and the edit updated automatically each time a shot was revised, which removed the need for compositing in an external package to create final shots.
“We’d create a first pass of the characters so that I could begin creating shots, and then the team would do another pass later in production once they could see what I was doing with the characters in those shots,” explains HaZ. “The entire process evolved organically, with no need for animatics; instead, the team simply iterated on scenes until they achieved the desired look and feel.”
Using DaVinci Resolve Studio as the backbone also made audio integration faster. As both director and editor, HaZ used the Fairlight Sound Library in conjunction with his own sound collection to rapidly sketch out a soundscape, bringing the edit to life. These sketches would then be exported and delivered to the sound design team, removing any need for sound spotting sessions or massive Excel spreadsheets.
One of the toughest aspects of the production was finding a style; the film needed to avoid looking like a video game or falling into the “uncanny valley” of animated films. “Mocking this up in Resolve, we experimented with OpenFX plugins to find and test the look before creating the shaders in Unreal Engine,” says HaZ.
A MegaGrant from Epic allowed HaZ and the team to create custom anime-style shaders, as well as a number of tools and widgets developed by Head of CG Andrea Tedeschi, giving HaZ control over every element of a shot.
During one collaborative session, the team realized that with all of the different scenes and branching narratives being created, they were effectively building the basics of a video game and seized the opportunity. The game is now being developed simultaneously, with the early access demo available via Steam in June.