
Kim Libreri / Epic Games

BY: Trevor Hogg

REAL-TIME REVOLUTIONARY

Before becoming the CTO of Epic Games and overseeing the development of the Unreal Engine, Kim Libreri was part of the visual effects team that made the virtual world of The Matrix franchise a reality. 

With a great-grandfather who was an architect for King Farouk of Egypt and a grandfather who masterminded England’s Baxi boiler heating systems, design and engineering are part of Kim Libreri’s family heritage. Growing up in Leyland, England, Libreri discovered his true passion during the late 1970s. “My father bought me an Atari 800 computer and I was hooked. I used to draw and paint as a kid and became fascinated by how you could teach a computer to make an image.”

Initially the plan was to go to the University of Southampton to study aerospace systems engineering, with the ambition of designing the next generation of space shuttles. “I received a sponsorship to work at the British Aerospace military aircraft division in Warton and my first job was working on an avionic system for the Eurofighter. While at British Aerospace I was encouraged to go into computer science because they already had so many aerospace system engineers, but not many people who understood computers.”  

During the final year of his Bachelor of Science in Computer Engineering at the University of Manchester, Libreri worked on transputers, processors designed for parallel processing. Later, while travelling in Australia, he discovered that Kodak was building its Cineon compositing system on similar technology. “I came back to England and looked for jobs that combined computer science, parallel processing, and filmmaking.”

Kim Libreri behind the scenes on a shoot for The Matrix Awakens: An Unreal Engine 5 Experience 

Libreri went on to work for Cinesite, ILM (Industrial Light & Magic), and Digital Domain, as well as founding ESC Entertainment. “ILM in the early 1990s was a catalyst for the rest of the industry.” A career highlight was working on The Matrix trilogy, which was driven by the desire to push technology. “When it came to rendering computer graphics everybody was stuck on rasterizing and we used ray tracing. We injected a lot of hardcore computer graphics science into our processes and a discipline for precision and finesse that imprinted itself on all of the artists who worked on the Matrix movies.”

The innovative approach towards visual effects led to two Technical Achievement Awards from the Academy of Motion Picture Arts and Sciences. “The virtual cinematography award was about how you take photography of a real-world set and turn that into something that a computer can understand and produce a shot that would work for bullet time,” notes Libreri.

“The universal capture award was based on us building the world’s first volumetric capture system where we have ‘x’ number of cameras photographing a subject and we reconstruct geometry through photogrammetry on a per frame basis.”  

The release of The Matrix Resurrections provided an opportunity to produce the Unreal Engine 5 tech demo The Matrix Awakens. “All the way through The Matrix Reloaded and The Matrix Revolutions we bought into the idea that even the visual effects should be simulated. Twenty years later, I sat down with Lana Wachowski and John Gaeta. I told them that I would love to see if all the stuff we did in The Matrix Reloaded and Revolutions could be done, not in the same style but better, on a modern gaming console. Lana and Warner Bros. thought it was a great idea to introduce The Matrix IP to a whole new generation of gamers and audience members.”

Kim Libreri behind the scenes on an LED stage virtual production shoot 

Nanite, one of the standout features of Unreal Engine 5, is the result of Brian Karis, a senior graphics programmer at Epic Games, attempting to solve how to use micropolygon geometry in the same way that RenderMan was used for visual effects in the old days. “It would allow the movie and game industries to merge together,” says Libreri. “For years Brian kept trying ideas and eventually, with the Nanite technology, he cracked it.”

Other members of the Unreal Engine team were inspired to produce additional features that include the global illumination system Lumen, a new World Partition system for building massive open worlds, improved AI, and MetaSounds, a procedural audio engine for controlling all aspects of audio rendering.  

“All of this was driven by trying to match the quality of a movie as best we could. This idea that an asset can be built the way you make it for a movie and just loaded into the engine is transformative. Most virtual production teams and virtual art departments have to deal with the fact that game assets are different from movie assets. You have to generate normal maps and build lower-resolution polygon meshes from your higher-resolution meshes; those extra steps, especially for environment art, are going away because of Nanite.”

Kim Libreri behind the scenes on a shoot for The Matrix Awakens: An Unreal Engine 5 Experience 

As more tools get added, it is important to make sure they do not conflict with each other. “Unreal Engine’s primary purpose is to make awesome games and entertainment experiences, and our customers need to be able to advance their projects in a unified way,” says Libreri. “We try to avoid having multiple variants of the engine. Within the company there are a bunch of us who have to think about the side effects and dependencies every time we add new features. Unreal Engine works as well as it does because we actually put our new technology through very complex production paces ourselves before releasing any new in-engine tools.”

What does the future hold for Epic Games and Unreal Engine? “As we head into the metaverse, we want to show that there doesn’t need to be a barrier between industries. If you are a filmmaker and use our engine to make content for an LED wall, then that content can also go into a game, and digital characters created for those experiences can be transferred to other mediums. We’re always thinking about ways to make sure there’s a fair world for creators to exist in, and that they have all the tools in place to be successful. This extends beyond Unreal Engine and into MetaHuman Creator, Quixel MegaScans, all of the free assets and sample projects, training, and Epic MegaGrants, which have been awesome for pushing the envelope in advancing new creative work. This is a transformative time in the film industry and it’s very exciting to be part of the evolution fueling next-generation entertainment.”

Kim Libreri
