
disguise inks collaborations with NVIDIA and Move.ai


Virtual production is going to be a big story at NAB Show 2023 and disguise has some interesting strategic moves to talk about.

A couple of announcements from disguise to kick off the week.

First up is a new collaboration with NVIDIA to integrate NVIDIA Omniverse with disguise’s platform. The integration will let users connect their preferred digital content creation tools — Maya, Cinema 4D, 3ds Max and more — in a unified production pipeline for 3D visualisation and virtual production. That in turn should mean easier and quicker changes, enhanced content production, the ability to work in full fidelity, and a lower barrier to entry for media and entertainment workflows.

The integration builds on RenderStream, disguise’s existing bi-directional protocol for transporting rendering information between third-party render engines and the disguise platform.

“This is the rising tide that lifts all boats,” says Raed Al Tikriti, Chief Product and Technology Officer at disguise, in a nice turn of phrase. “The integration with NVIDIA Omniverse connects content creation tools to industry-leading real-time engines such as Unreal Engine, and opens up future avenues for connecting AI-assisted content creation workflows that are evolving rapidly and taking the industry by storm.”

Move.ai partnership

We wrote about markerless motion capture technology provider Move.ai fairly recently, and it’s good to see the company notching up commercial successes so soon after launch. The partnership is a bit of a no-brainer, really: it marries advanced markerless motion capture with disguise’s graphics processing, and to make it happen disguise and Move.ai are developing a custom AI technology dubbed Invisible.

Move.ai’s tech extracts natural human motion from video, using AI, computer vision, biomechanics and physics to automatically retarget the data to a character rig and create a virtual character that mirrors human motion in real time. The idea is that Invisible integrates with the scalable processing capabilities of disguise hardware, feeding motion capture data directly into creative workflows in the disguise Designer software. Meanwhile, disguise’s RenderStream protocol handles the transfer of skeleton data across the disguise Unreal Engine rendering cluster, allowing for tighter synchronisation of content and tracking data across the production workflow and a seamless merging of the physical and virtual worlds.

It will be interesting to see it in action. Public availability is currently slated as May 2023.

Tags: Virtual Production