
Zero Density brings real-time ray tracing to virtual studios with Chaos


Zero Density and Chaos are previewing real-time ray tracing inside a live virtual studio pipeline at NAB 2026, while ZD also introduces AI newsroom tools, Gaussian splatting, and automatic lens profiling.

Zero Density and Chaos have announced an R&D partnership that integrates Chaos Vantage's real-time ray-tracing engine with Zero Density's Reality 5 virtual studio production platform. Chaos Vantage handles real-time ray tracing while NODOS — Zero Density's node-based compositor — delivers the virtual studio and XR toolsets. The preview runs on both green screen and XR video wall setups.

Reality 5 has been redesigned to support multiple render engines — Chaos Vantage, NVIDIA Gaussian splatting, and Unreal Engine 5 — giving production teams a choice of rendering path without changes to the core production pipeline. Vantage is built on the same technology base as V-Ray, which has an Emmy Award to its name, and brings fully ray-traced rendering to live broadcast environments.

Kuban Altan, Co-Founder and CTO of Zero Density, described the partnership as research into applying ray tracing to virtual studio production — with NAB as the first public outing for the combination. Vlado Koylazov, Head of Innovation at Chaos, noted that Vantage was designed to deliver ray-traced rendering without the overhead of conventional game engine pipelines, opening the technology to broadcast professionals working in live environments.

More from the ZD booth

Beyond the Chaos partnership, Zero Density's NAB showcase includes several additional announcements.

Alongside Chaos Vantage, the company is demonstrating NVIDIA's Gaussian splatting renderer on both green screen and XR stages. Gaussian splatting renders complex, photorealistic environments in real time with greater computational efficiency than traditional 3D reconstruction techniques.

Elsewhere, Reality Hub gains AI-assisted newsroom tools ahead of a release later this year. Designed for journalists and producers, the tools cover proofreading, summarization, text extraction, image description, and translation across 100+ languages. For live news, they also enable real-time transcription and subtitle translation using Google's Gemma 4 model, running locally on a Reality Hub server.

Reality 5 will also add automatic 3D color profiling for XR video walls, generating profiles within minutes rather than through manual processes. The feature matches set extension colors to in-camera capture. This is being shown in private sessions only.

The forthcoming Traxis Hub release introduces an Automatic Lens Profiler, allowing broadcasters to build zoom lens profiles in minutes rather than hours. There are also updates to Lino, the company's template-based broadcast graphics workflow for news and live production, adding Journalist Preview with MOS plugin integration and a dedicated Ticker Module.

Tags: Virtual Production Zero Density Chaos NAB 2026
