Beeble and The VPX Lab used NAB 2026 to show a portable virtual production pipeline that skips LED volumes and keeps creative decisions open through post.
At the show, Beeble and The VPX Lab demonstrated the workflow off-site, on a hotel terrace in Las Vegas, running on laptops from Puget Systems. It combined AI relighting, real-time environments, and virtual cameras into a single continuous pipeline, eliminating the need for LED volumes or greenscreen entirely.
The idea is to collapse previs, production, and post into one iterative loop, and make the whole thing portable enough to run on location without fixed infrastructure. Easy, eh?
The pipeline moves through four stages. Scenes are first blocked in virtual environments built with tools such as Terra from District Cinema and Gaussian-splat environments from XGRIDS and WorldLabs. Directors then develop the shots, rehearsing camera moves, staging, and performance in real time using virtual cameras, markerless motion capture, and environments built in Unreal Engine.
Live-action footage is then captured on location with cinema cameras and spatial tracking. Without the need for LED volumes or greenscreen, the setup remains lightweight and portable, while still capturing the data required for further work downstream.
Finally, that footage is processed through Beeble's SwitchLight and SwitchX tools, allowing lighting, environments, and visual context to be reshaped after the shoot rather than locked at the point of capture.
It's not yet the holy grail, where previs and post simply meld into a single process of real-time production, but it's a definite step on the way.
"What's changing is that production is no longer the point where decisions get locked in," says Conrad Curtis, Production and Partnerships Lead at Beeble. "You can carry creative intent all the way through the process — shaping light, environment, and performance even after the shoot."
To find out more about this workflow, visit beeble.ai and vpxlab.com.