My feet hurt!




BY: Steven Poster ASC

As the 2024 NAB Show lights up Las Vegas, Steven Poster ASC enjoys an illuminating experience at the city’s newest draw, the Sphere.  

I just returned from Las Vegas and this year’s edition of the NAB convention. As many of you already know, this is one of the largest showcases for ‘new and improved’ equipment. Throughout the year we are given several important occasions for companies to display and demonstrate the latest tools and systems to make our working and creative lives, if not easier, at least better.

This year was filled with lenses, lights and a lot less AI than I thought we would see. But first, a small, select group got to see a spectacle that I never thought would impress me as much as it did.

Andrew Shulkind, head of capture and innovation for MSG Sphere Studios, and his colleagues invited a group of technologists to the Sphere on the Strip to experience the show on the largest LED screen in the world, in a venue that holds up to 20,000 viewers. We were tucked away comfortably in a row of deluxe VIP suites, with libations and three rows of seats equipped with haptic units that moved and shook to match the action on the screen, adding an extra layer of excitement to the presentation.

The dome is huge, with several floors of seating. The show playing was a kind of futuristic travelogue directed by filmmaker Darren Aronofsky, with cinematography by Matthew Libatique ASC LPS. Libatique shot the opening and closing pieces, and much of the travelogue was shot with an amazing camera called Big Sky, with lenses that Andrew Shulkind had a hand in developing. The camera produces 316-megapixel images from a 3×3-inch HDR sensor that captures 18K × 18K frames at up to 120 frames per second.

To enhance the images further, this theatre produces atmospheric effects like winds that mimic what you might feel standing in a desert or next to a waterfall. When I first heard about these effects, I was reminded of my experiences as a kid going to see “This Is Cinerama” or even Smell-O-Vision (which really did exist for a moment). Cinerama really worked when you were viewing the rollercoaster ride on the large, curved screen, but Smell-O-Vision never did get off the ground. If you want a laugh, read its Wikipedia entry. Everything ended up smelling like bathroom perfume or garlic!

Not so the Sphere. The experience was much more than I expected. It starts off in a standard widescreen format and slowly expands until you are engulfed by images, and it does so in such a way that you are almost unaware of the change until it is all around you. The beautiful images of locations you would rarely see in person made the show feel like a journey worth taking. The sound, from 168,000 speakers, enveloped the audience in a warm, inviting way. Even the haptic seats were fun, used in an appropriately minimal way.

What a great way to start our “journey” through the Las Vegas Convention Centre. Over 12,000 steps a day took us through the Central Hall, where all the production equipment and systems were displayed. In two days we never even made it to the South Hall, where all the post-production systems are shown.

As I mentioned, at least in the Central Hall there was very little in the way of AI. But there were several examples tied to virtual production that were very interesting. In the B&H booth there was a demonstration of Assimilate’s Live FX, which combines virtual production playback and blue- and green-screen matting in one tool.

Seeing Assimilate’s Live FX in action at NAB was like having a full media server on your desktop or laptop. Live FX can play back 2D and 2.5D content at multiple resolutions and frame rates to a virtual production LED wall from a single machine, and it supports full camera tracking, projection mapping and image-based lighting from one tool. It has a native implementation of Notch Blocks and of the Universal Scene Description (USD) file format, which allows Live FX to work with 3D environments that can also be shared with Unreal Engine and Unity. It even allows for colour grading on different planes and for matte compositing of blue or green screen in real time, enabling set extensions beyond the physical LED wall.
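For anyone curious what that USD scene format looks like in practice, here is a minimal sketch using Pixar’s open-source usd-core Python package, not Live FX itself; the stage file name is hypothetical. It opens a USD stage and walks the scene graph, the same hierarchy a tool like Unreal Engine or Unity would read:

    # Minimal look at a Universal Scene Description (USD) stage using
    # Pixar's open-source usd-core package (pip install usd-core).
    # "virtual_set.usda" is a hypothetical file for illustration.
    from pxr import Usd

    stage = Usd.Stage.Open("virtual_set.usda")

    # List every prim (scene object) with its path and type
    for prim in stage.Traverse():
        print(prim.GetPath(), prim.GetTypeName())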

Live FX can feed any number of lights by pixel-mapping each fixture to the underlying image content, as could be seen in the KinoFlo booth, where it integrated their lighting seamlessly. In addition, Live FX can be brought on location for live composites or for quick previz during location scouting.
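To make the idea of pixel mapping concrete, here is a conceptual sketch in Python of the general technique, not Assimilate’s implementation: each fixture is assigned a region of the background frame, and the average colour of that region drives the light. It uses the Pillow imaging library, and the file name and fixture regions are hypothetical.

    # Conceptual pixel mapping: average the frame region assigned to each
    # fixture and use that colour to drive the light (pip install Pillow).
    from PIL import Image

    def fixture_colour(frame, region):
        """Average colour of the frame region mapped to one fixture."""
        # Resizing to 1x1 with BOX resampling is a true area average
        patch = frame.crop(region).resize((1, 1), Image.Resampling.BOX)
        return patch.getpixel((0, 0))  # (R, G, B)

    frame = Image.open("background_frame.png").convert("RGB")

    # Two imaginary fixtures mapped to the sides of a 1920x1080 frame
    fixtures = {"key_left": (0, 0, 480, 1080),
                "fill_right": (1440, 0, 1920, 1080)}

    for name, region in fixtures.items():
        r, g, b = fixture_colour(frame, region)
        # A real rig would send this value to the fixture, e.g. over DMX
        print(f"{name}: RGB ({r}, {g}, {b})")

A real system does this for every frame at the wall’s refresh rate, but the mapping idea is the same.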

After seeing that system demonstrated at B&H, we walked over to KinoFlo’s booth to see it working with the new Mimic lighting fixtures: content is fed through the Assimilate system to a Megapixel Helios processor and then to as many Mimics as you need to make your images come alive. It is very exciting to see these lights surround the stage and create moving reflections from the moving background images, in sync with the same images being generated on the LED wall in front of your camera.

During the convention there are always many parties after the halls close each night, hosted by the exhibiting companies. After walking many kilometres around the halls, I am usually not in the mood for parties. But there was one I wouldn’t miss: the 50th anniversary of Garrett Brown’s invention of the Steadicam System, hosted by Tiffen Manufacturing. The room was packed with Steadicam operators from all over the world who wanted to celebrate with Garrett, one of the greatest inventors of our time. He was honoured years ago in the National Inventors Hall of Fame alongside people like George Eastman and Thomas Edison. Garrett is a true gentleman and a very kind person, proud to have given us a toolset that after 50 years keeps getting better.
