RISE OF THE MACHINES
Controlled remotely or programmed in advance, robotic camera supports provide precision, repeatability, and the power to get the lens closer to some subjects than ever before.
Cinematographer Rob Drewett got into natural history filming via a love of scuba diving. It was while filming underwater, “being able to move the camera around as freely as possible without gravity taking force,” that he started to think about new methods of camera support.
“There was one time where I was in India… filming peacocks… but these peacocks were being a pain and not coming towards me.” He found himself looking at a gimbal rig he had used for another recent project, “and I was just thinking, if I get that gimbal and invert it and put it on a car or something like that, maybe I could then get closer to the animals instead of me waiting for them to come to me.”
These were the first steps of a journey that saw Drewett teaming up with product design engineer Andy Nancollis and founding Motion Impossible. Drewett’s car idea became the Agito, a modular remote camera dolly.
Three Agitos were utilised by Cineworks in a recent H&M commercial, and some of them appear on camera in the finished product. “I think they like the fact that it’s futuristic looking, and it’s not got a person,” says Drewett, “so you’ve not got a character to be in shot. It’s a bit like wildlife, really, because an animal doesn’t know what an Agito is, and they don’t freak out… A person with a tripod is not allowed to walk around an animal that’s protected. You’re not allowed to grab a sequence as it happens.”
Drewett shot an upcoming Disney programme about hyenas with his Agito. “We are driving at about 20mph about 45cm off of the floor, stabilised 1,500mm, filming hyenas walking across sand. But if we need to, we can then not be 1,500mm, go to a wide, get nice and close, and we can capture those moments instantaneously… Using a good setup like a Shotover M1 gimbal with a Canon 50-1,000mm you can cover a lot within a short space of time. I am free to be away from the vehicle, so I’m in a car. I can control the whole of that camera, and my Agito operator controls the movement from about 500m away with no problem at all.”
Far from the African plains, robotics combine with virtual production at ARRI Stage London. Director-DP Brett Danton has been working there with WPP’s Creative Tech Apprenticeship programme, bringing the apprentices’ ideas to life. One of the tools he uses is a Cinema Robot from High Speed Films.
“We had a girl jumping off the top of a building,” he offers by way of example. “You’re in the volume stage… The little platform that you build in the studio to the walls – there’s not a massive amount of tolerance levels to pull off the illusion… If we did it traditionally with a crane, yes, they’re very good, but they can’t always hit those marks all the time.”
Danton goes on to describe an intriguing new workflow. Normally the quality and complexity of virtual backgrounds is limited by what the volume’s processing power can render in real time as it tracks the camera move. However, if the camera move is pre-programmed and repeated precisely by a robot, the background can be rendered in advance to any desired quality.
The scene is first blocked with the cast, and the camera move recorded. Then, in the time a traditional shoot might take to light and do checks, the robot’s movement data is used to render the background with a highly realistic but processor-intensive method called path tracing. “So basically inside 20 minutes we are rendering photorealistic backgrounds and therefore really upping the quality inside a volume stage,” Danton explains.
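The idea underpinning this workflow can be sketched in a few lines: because the robot repeats the move exactly, the recorded camera path becomes a fixed input to an offline render, and per-frame render time no longer has to fit inside a real-time budget. The sketch below is purely illustrative and assumes nothing about any real volume-stage or renderer API; every name in it is hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of the workflow Danton describes: the robot's
# recorded move is a list of timestamped camera poses, and each pose
# can be rendered offline with a slow, high-quality method such as
# path tracing instead of in real time. No real API is implied.

@dataclass
class CameraPose:
    t: float          # seconds from the start of the move
    position: tuple   # (x, y, z) in metres
    rotation: tuple   # (pan, tilt, roll) in degrees

def render_background_plates(move: list[CameraPose]) -> list[str]:
    """Produce one background plate per frame of the recorded move.

    Stands in for the offline path-traced render: with the camera
    path fixed in advance, quality can be raised without worrying
    about hitting a real-time frame rate.
    """
    plates = []
    for i, pose in enumerate(move):
        # A real pipeline would hand pose.position / pose.rotation
        # to the renderer here; we just name the output frame.
        plates.append(f"plate_{i:04d}.exr")
    return plates

# A two-second move recorded at 24fps yields 48 poses, so 48 plates.
move = [CameraPose(i / 24, (0.0, 1.5, i * 0.1), (0.0, 0.0, 0.0)) for i in range(48)]
print(len(render_background_plates(move)))  # 48
```

The key design point is that the pose list is captured once on set and replayed byte-for-byte, which is what makes the pre-rendered background line up with the physical camera.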
Another emerging combination of robotics and virtual production is the use of pre-visualisation to save time on stage. “You can program out your move,” says Danton. “You get everything set, and you can transfer that into [the robot] when you get into the volume stage… because it’s an expensive place to shoot.”
Didier Daubeach prepared thoroughly when he used robotics in a Samsung commercial highlighting their phone’s folding design. “This project is a perfect fusion of technology and art, where light plays a crucial role,” comments the DP, whose collaborators included choreographer Sadeck Berrabah. “Sadeck, known for his participation in major television shows and international events, developed choreographies based on the body synchronisation of the arms and hands of the group of dancers… similar to a sea anemone undulating with the current. Director Stéphane Barbato wanted to add a unique visual dimension by equipping each dancer’s forearms with LEDs. These lights, synchronised with their movements, had to be tracked in real time by the camera.”
Daubeach decided that a traditional crane or dolly would not be fast or precise enough to follow the quick and complex moves of the 64 dancers’ arms. He instead selected Cinemotion’s Ultra, a robot arm with a reach of 3.1m and top speed of 5.5m/s. 3D pre-visualisation was used so that Barbato could see and adjust the camera moves ahead of the shoot.
“My goal was not to create spectacular ‘gratuitous’ movements,” Daubeach points out, “but rather to integrate all the artistic constraints of movement, rhythm, and light. The robotic arm had to become an extension of the choreography, a tool capable of tracking the LEDs with millimetre precision, without ever being late or early, while respecting the nuances of the music and the dancers’ expressiveness.”
Adrian Langenbach also required millimetre precision when he shot a commercial for Märklin, a storied German brand of H0 gauge model railways. “Our goal was to immerse ourselves in this enchanting world by telling a simple father-daughter story,” says the DP. The father and daughter join stop-motion figures in the miniature model world.
Langenbach employed the Milo from Mark Roberts Motion Control, a track-mounted system with an arm reach of nearly 5m. “We first filmed the entire miniature set – including clean passes as well as numerous passes in which we animated the various layers and elements. Milo offers the highest level of precision, allowing us to repeat the exact same camera moves over and over again, then seamlessly merge them in post-production.”
The Milo was operated by Julian Hermannsen from Visual Distractions in Hamburg. “We primarily addressed the impossibilities in a 3D animatic before moving on to creative camera work on the real set, which we approached interactively,” says Hermannsen. “Given the project’s scale of 87:1, it was crucial to keep its limitations in mind.” Some shots required parts of the set to be cut away so the lens – an Innovision Probe II snorkel – could get close enough to the 1.5cm-tall figures.
Langenbach adds: “All the moves and camera motions from our panoramic shots were later adjusted to match the real-life size of the actors [who were filmed against a green screen]. We had to account for this already during filming of the miniature because a camera move of just a few centimetres could translate to several metres on the actual set – making certain lift movements impossible.”
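Hermannsen’s 87:1 figure makes it easy to see why those moves had to be planned so carefully. A minimal sketch of the scale conversion (the 5cm move is an illustrative value, not one from the shoot):

```python
# Scale quoted by Hermannsen for the Märklin project: the full-size
# set is 87 times larger than the H0-gauge miniature.
SCALE = 87

def miniature_to_full_size(move_cm: float) -> float:
    """Convert a camera move on the miniature (in cm) to the
    equivalent move on the full-size set (in metres)."""
    return move_cm * SCALE / 100  # scale up, then cm -> m

# Illustrative: a 5cm move on the model equals a 4.35m move at full
# scale -- hence "a few centimetres could translate to several metres".
print(miniature_to_full_size(5))  # 4.35
```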
“A friend of mine runs a studio that shoots a lot of toys, and they’ve traditionally done that stuff with sliders and little mini cranes,” says Danton. “They’re looking at one of the smaller robots at the moment, because they know it will just work in their workflow. Sometimes somebody says, ‘Oh, can we change out that toy in one of the shots?’ They will save all of their data now on those commercials, and they can go back and repeat that move and swap out that one little bit of product if they need to. So I think that’s where a lot of those tools will start to be used.”
“The younger people are coming through who are naturally more connected to this type of workflow, in the filming world,” Drewett comments. “It’s already happening in TV, live broadcast. They’re always adopting new ways of moving cameras. It’s always been a bit of a problem to get it into features. And I’m starting to see that happening more and more with younger people coming through. They grow up with controllers in their hands, and what we’re presenting is a professional controller.”
“Motion control has been around for several decades, and its applications have evolved over time,” Hermannsen reflects. “While some traditional uses have been replaced by technologies like 3D tracking, new opportunities have emerged, such as moving projection mapping and LED studio environments. However, there are certain effects – like those involving real actors or objects, scaling, and multi-layer shots combining actors with miniatures – that are technically impossible to achieve without motion control. In my experience, once we execute a complex effect on a project, the DP and directors often return with even more ambitious ideas for the next one. Ultimately, motion control is not just a tool; it’s a catalyst for pushing creative boundaries and unlocking new possibilities.”