(Art)ificial intelligence: a new force in filmmaking?
The AI Rubicon has been crossed, but is the rise of the robots a good thing for cinematography?
To call Stanley Kubrick a visionary would be an understatement. Having added artificial intelligence (AI) to the cinematic lexicon via the murderous computer HAL in the 1968 classic 2001: A Space Odyssey, and then with A.I. Artificial Intelligence (2001) – completed by Steven Spielberg after Kubrick’s death – he was telling the world that AI would one day challenge human exceptionalism.
Flash-forward 22 years and computer-demonstrated intelligence is a talking point in almost any industry you care to mention. Students have been caught using AI chatbot ChatGPT to “cheat” in assessments, while in June, Sir Paul McCartney revealed that he got by with a little help from AI to extract the late John Lennon’s voice from a demo to create a new and “final Beatles song”.
The same month, the UK government unveiled plans to invest in AI special-effects research for film and TV, with circa £150m ringfenced for research labs to help boost the creative economy.
Now that humankind has crossed the Rubicon as far as AI is concerned and the future envisioned by Kubrick is here, filmmakers need to find the best way to work with computers going forward.
Coz Greenop, director of House Red, says AI, like all modern tech, should be treated as a tool at filmmakers’ disposal. “Whether it’s used for pre-vis or during production it should be used to help tell the story,” he explains. “AI should not drive a narrative for a filmmaker – it should only be used to help tell a story more cohesively.”
Greenop also thinks AI could help make the process of creating visual effects more efficient, by helping storytellers realise their visions without spending millions on practical effects, whilst pushing the boundaries of physics in a safe environment. “If it allows filmmakers to achieve more within budget then it’s win-win,” he says. “The moment we change our stories, shots, and narratives to accommodate technology, we’ve lost a sense of creative freedom. When I first watched Avatar back in 2009, I saw that James Cameron chose specific shots to accommodate 3D, not because the shot enhanced the story but vice-versa. For me, that is the death of cinema.”
Cinematographer Markus Mentzer (Clara’s Ghost) adds: “I don’t know if it makes creating visual effects more efficient, since it still takes a human eye to figure out whether the effect works or not and then to fix that. I do think it will make the process of using visual effects in film more accessible to everyone.”
That said, AI is gaining traction in movie-making – just ask Christiane Kinney, entertainment attorney at Kinney Law, P.C., who knows of people using the chatbot ChatGPT as a “launching off” point for story creation, typing in prompts for story ideas. However, this approach comes with a legal warning.
“Generative AI is derivative of all that has come before it – although all art arguably is – but if it is being generated from unclean/unlicensed data sets and someone is arguing fair use, they’re already on the wrong side of the issue,” she says. “They’re arguing a defence to copyright infringement.”
Human vs. robot
Advancements in technology across various sectors often evoke concerns about the potential displacement of human jobs by machines, and the film industry is no exception.
The International Alliance of Theatrical Stage Employees (IATSE) recently stated its key principles for dealing with the growing use of AI in the entertainment business. Representing below-the-line workers, the union highlighted the need for a thorough strategy to address the potential impact on employers’ business models and its members’ livelihoods.
Kinney also voices fears about the negative impact AI is having on the job market for creatives. “The thing I see most clearly happening is that entry level jobs are being replaced by AI tools,” she says. “How does the younger generation get a foot in the door if entry level positions are obsolete?”
The replacement of humans with AI is already visible in post-production, where documentary maker Zach Melnick of Inspired Planet Productions says the technology is helping him immensely.
“The biggest change so far has been with transcriptions, an essential but hugely time-consuming process,” he says. “OpenAI’s Whisper speech-to-text software is, in our experience, better and far faster and cheaper than humans for transcriptions, even getting correct many technical words that unspecialised transcribers wouldn’t be able to identify.”
Melnick currently uses Whisper through StoryToolKit AI, which integrates with DaVinci Resolve and enables text-based editing inside the application, similar to how Script Sync works in Avid Media Composer. The latest version of Resolve includes its own speech-to-text system and Melnick says he’s “especially excited about automatically creating subtitles inside the app, another costly process that can’t be automated fast enough”.
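For readers curious what that kind of workflow looks like under the bonnet, the sketch below is a minimal illustration using the open-source openai-whisper Python package: it transcribes an audio file and writes a basic SRT subtitle file from the timed segments Whisper returns. The file names and model size are illustrative assumptions, not details of Melnick’s pipeline; tools such as StoryToolKit AI and Resolve wrap similar calls in editor integration.

```python
# Minimal sketch: transcribe an interview and write a basic SRT subtitle file
# with OpenAI's open-source Whisper model. File names and model size are
# illustrative choices, not details from the article.
import whisper


def to_srt_time(seconds: float) -> str:
    # Convert seconds to the HH:MM:SS,mmm format that SRT files expect.
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"


model = whisper.load_model("medium")        # larger models are slower but more accurate
result = model.transcribe("interview.wav")  # returns full text plus timecoded segments

with open("interview.srt", "w", encoding="utf-8") as srt:
    for i, seg in enumerate(result["segments"], start=1):
        srt.write(f"{i}\n")
        srt.write(f"{to_srt_time(seg['start'])} --> {to_srt_time(seg['end'])}\n")
        srt.write(f"{seg['text'].strip()}\n\n")
```

Because Whisper returns each segment with a start and end time, the transcript stays tied to timecode, which is what makes the text-based editing Melnick describes possible.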
Furthermore, despite apprehension across the sector, Melnick is sanguine about the potential advantages of AI – notably its ability to stabilise footage, decrease noise, enhance dynamic range, fix timelapse issues and improve resolution.
“We also do a lot of work underwater, which has natural limitations in how far you can see, colour shifts and sharpness caused by the optical effects of water and the beasties that live in it,” Melnick adds. “It would be awesome to have the ability to enhance underwater contrast and colours using AI. I think maybe AI tools will allow us to take bigger risks creatively, with the ability to ‘fix it in post’ turned up to 11 compared with the technology we used to have.”
That said, Eve Psalti, senior engineering director at Microsoft, says there is an opportunity for the industry to focus on reskilling and upskilling its workforce to adapt to the changing landscape. “Emphasising the creative aspects of filmmaking, which require human judgment and imagination, can help professionals carve out new roles that complement AI technologies rather than being replaced by them,” she says. “Also, maintaining a balance between automation and human creativity is crucial for preserving the artistic integrity of filmmaking.”
Diversity and inclusion are other hot topics in the modern industry, and Psalti explains how AI can help create more inclusive and diverse stories and characters in films and TV shows, while still maintaining their authenticity.
“It’s important to point out that AI is a tool so if we train this tool with inclusive and diverse data sets, the results can also be inclusive and diverse and can play a significant role in creating more inclusive and diverse stories and characters in films and TV,” she says. “Recent innovations like AI-powered natural language processing algorithms can assist in analysing scripts, dialogues, and social media conversations to identify potential biases or stereotypes.”
Psalti says that by highlighting problematic language or representations, AI tools can prompt creators to reconsider their approach and make more inclusive choices.
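As a rough illustration of the kind of script analysis Psalti describes, the sketch below uses the Hugging Face transformers library with a publicly available toxicity classifier to flag lines of dialogue for human review. The model, threshold and sample lines are assumptions made for illustration rather than tools referenced in this article, and a flag is a prompt to reconsider, not a verdict.

```python
# Illustrative sketch: flag potentially problematic dialogue for human review
# with an off-the-shelf text classifier. The model and threshold are
# illustrative assumptions, not tools referenced in the article.
from transformers import pipeline

classifier = pipeline("text-classification", model="unitary/toxic-bert")

script_lines = [
    "INT. NEWSROOM - DAY",
    "EDITOR: Get me that story by five or you're finished.",
    "REPORTER: I'll have it on your desk.",
]

for line in script_lines:
    prediction = classifier(line)[0]   # top label and its confidence score
    if prediction["score"] > 0.5:      # flagging threshold; tune per project
        print(f"Review suggested ({prediction['label']}, {prediction['score']:.2f}): {line}")
```

In practice a production tool would run over a whole screenplay and surface flagged passages alongside their context, leaving the judgment call with the writer or script editor.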
“AI-powered technologies can assist in making films and TV shows more accessible to diverse audiences,” she adds. “By providing automated closed captions, audio descriptions, and translations into different languages, AI can ensure that content reaches a wider range of viewers, including those with disabilities or language barriers.”
As far as translations are concerned, Ray Gillon, managing director/chief executive officer at G-Minor, says AI offers clear advantages.
“When the translations become reliable and trustworthy, AI will speed up things and ultimately the options for lip sync adoption,” he says. “Currently we make decisions that sometimes compromise the integrity of the text and script in the original language version.”
Ultimately, Psalti believes the responsibility lies with human creators to ensure that AI is employed thoughtfully, ethically “and with a deep understanding of the diverse audiences they seek to represent”.
In too deep?
One of the most contentious offshoots of computer intelligence is undoubtedly deepfake technology. Having gained notoriety for its use in the adult film industry, it is also a tool to bring deceased actors back from the grave and to digitally de-age living ones; it was recently used to make octogenarian Harrison Ford look younger in flashback scenes in Indiana Jones and the Dial of Destiny.
Greenop says actors are copyrighting their image to be used for deepfake projects, and this could cause more harm than good. “I’m hoping this isn’t the route we are going down as, for me, the filmmaking process is always about collaborating with creative and like-minded people,” he adds. “The thought of Tom Cruise’s image being used in an independent art house film just doesn’t work.”
Benjamin Field, producer/director of Gerry Anderson: A Life Uncharted, a film about the Thunderbirds creator, has used deepfake technology several times in commissioned work and research and development projects.
“I think I’m probably well placed to know the limitations and the pitfalls,” he says. “Deepfake is essentially VFX trickery and it’s capable of being a very convincing tool for face replacement but that is where it ends. Deepfakes don’t think, react, or emote. They exist, like a clever moving jpeg. Performances are created by performers who understand nuance, subtext, and emotion.”
Despite being a deepfake aficionado, Field is concerned that directors and producers can utilise the technology to lie to audiences. “They can be used to trick [audiences] into believing that they are watching the performance of one actor, when in reality they are watching the performance of another,” he adds. “When you do that, you must ask yourself, what is the purpose of that lie? We live, work and breathe in a creative industry; using deepfake to lie to our viewers for profit feels like a huge step in the wrong direction.”
Media organisations can defame dead people without legal consequences, but Kinney says that, when it comes to deepfakes, individuals have a right to protect and monetise their name, image, and likeness – sometimes even after death, depending on the laws of the jurisdiction where they die – with lawsuits brought by their heirs.
“When you can utilise AI to create ‘deep fakes’ of a person’s voice, image, or likeness, permission and transparency is an absolute necessity,” she adds. “The technology exists and can be a huge financial saving to the productions, for example, utilising AI in the ADR process to pick up certain words that weren’t captured well during a take. This is an important time to be having these conversations and being extremely clear in contracts when deepfake videos or AI-generative voice overs are going to be utilised and how artists will be compensated for such use.”
Kinney sums it up when she says, “at this point, we’re all playing fortune tellers”, but she thinks it’s fair to say AI will play a huge role in the future of cinematography. Mentzer says: “I think AI will increase the value of human creativity in film and television.” The uses AI offers are plain to see, but what’s less clear is how reliant on it we might become and how to tell the difference between the work of a computer and that of a human hand. That goes for every sector, of course. After all, how do you know this article wasn’t written by AI?