
Three technologies that will change the future of virtual production

Image: FrameStore, All of it Now, XR Stage.

In this exclusive interview we spoke with Ed Plowman, CTO of disguise, about the technologies he predicts will revolutionise virtual production sets in the future.

When COVID-19 put a pause on filming, virtual production (VP) came to the rescue – allowing cast and crew to safely create new feature films, TV, and content for streaming platforms.

Today, LED-based virtual sets have become widely known as an effective solution for capturing multiple locations without needing to travel, as well as shooting in-camera VFX without a green screen – even on a low budget and with a tight shooting schedule.

As virtual production develops, enjoying a predicted annual growth rate of 15.8% from 2020 to 2027, where can we expect to see the technology go next?

Ed Plowman, CTO of disguise, has a few ideas. Backed by Epic Games, disguise is a software and hardware platform that pushes boundaries in virtual production. Together with his team, Ed has recently completed a UK government-funded research programme to develop disguise’s workflow into the most integrated and robust solution for virtual production on the market.

Read on to learn his insights into how this incredible technology will continue to change creative workflows in the future.

Image: Savannah College of Art and Design.

ACES and HDR Reproduction

“We’ve only just scratched the surface of what the current LEDs can do,” begins Plowman. For him, the first big change we’ll see in VP technology is an industry-wide integration of ACES and HDR reproduction into existing production workflows.

Until recently, the LED world was primarily for the live entertainment market, so ACES and HDR weren’t major concerns. As virtual production started shifting towards entertainment markets that need high-end production values, like film and VFX, this soon changed.

“For me, this was first highlighted when disguise began interacting with high-end visual effects companies like Framestore and Double Negative,” Plowman confirms. “These companies immediately asked us how they could manage colour and HDR pipelines with our platform.”

The problem, according to Plowman, is uniformity. “Many LED elements are mass produced and primarily targeted at jumbotrons,” he adds. “For film and broadcast, we're obviously using them in a very different medium. The great news is, LED technology has an infinite reproduction range, so the industry has a golden opportunity to develop new solutions.”

According to Plowman, the future will see the introduction of many different LEDs for different use cases including budget, portable and even foldable options. “The use of light field technology in virtual production will probably happen at some point too, depending on cost and complexity,” he adds. Light field technology captures information about the light field emanating from a scene rather than just the light intensity itself.  

Of course, the hardware is only half the story. To ensure accurate colour reproduction, the way cameras capture rushes from the LED screens and the way real-time software interprets that data needs to change too.

“In a VP stage, you’re outputting data from a 3D programme with a fixed colour space through to an LED wall. The LED wall portrays colours that can slightly change dependent on your point of observation. This footage is then captured by cameras that have their own internal colour reproduction as well. Then you're introducing physical set items as well, like people and objects in the real world, which are lit by physical lighting,” Plowman explains. Industry-standard colour management pipelines like ACES help with this, which is why disguise implemented ACES into its Designer software.
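To make the “fixed colour space” point concrete, here is a minimal sketch of the first two steps any colour-managed pipeline performs before converting into a working space such as ACES2065-1: undoing the display transfer function, then moving into a device-independent space (CIE XYZ). The transfer function and matrix below come from the sRGB specification; a production ACES workflow would typically use a dedicated library such as OpenColorIO rather than hand-rolled maths.

```python
# Sketch: early steps of a colour-managed pipeline, assuming sRGB-encoded
# input in the 0..1 range. A real ACES workflow would continue from XYZ
# into the ACES2065-1 (AP0) working space via the standard matrices.

def srgb_to_linear(c: float) -> float:
    """Undo the sRGB transfer function (per IEC 61966-2-1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Linear sRGB (D65 white point) -> CIE XYZ, from the sRGB specification.
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def encoded_srgb_to_xyz(rgb):
    """Decode an sRGB triplet to linear light, then transform to XYZ."""
    lin = [srgb_to_linear(c) for c in rgb]
    return [sum(m * v for m, v in zip(row, lin)) for row in SRGB_TO_XYZ]
```

Feeding sRGB white `[1.0, 1.0, 1.0]` through this returns the D65 white point in XYZ, which is one quick sanity check that a pipeline’s colour maths is wired up correctly.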

“There’s a lot of complexity there. As an industry, we need to work out a way for there to be a trade-off between camera settings and 3D settings, along with improving the accuracy of LEDs.”

Image: Final Pixel.

Nanosecond Calibration

Faster, better calibration is another area Plowman predicts will hugely improve on future virtual production sets. In fact, it’s a major focus area for him and his team at disguise. Currently, one of the team’s main goals is to ensure calibration is made more dynamic, even to the nanosecond level.

“We want VP systems to be able to calibrate the position of a piece of content, then adjust it according to the position of the camera – all with minimal latency. This means LED images will always look accurate from the camera’s perspective,” Plowman explains.

“To do this, we’re talking to processor manufacturers, camera manufacturers and tracking providers to try to break the hold of Genlock – the synchronisation mechanism that’s currently the most commonly used – and move it over to a PTP network. That will bring us nanosecond timing.”
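The PTP (Precision Time Protocol, IEEE 1588) exchange that delivers this nanosecond timing reduces to a simple piece of arithmetic: the master and slave swap four timestamps, from which the slave computes both its clock offset and the network path delay. A minimal sketch with illustrative timestamp values (the nanosecond numbers below are invented for the example):

```python
def ptp_offset_and_delay(t1: int, t2: int, t3: int, t4: int):
    """Standard IEEE 1588 offset/delay calculation, timestamps in ns.

    t1: master sends Sync         t2: slave receives Sync
    t3: slave sends Delay_Req     t4: master receives Delay_Req
    Assumes a symmetric path delay in both directions.
    """
    offset = ((t2 - t1) - (t4 - t3)) // 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) // 2    # one-way path delay
    return offset, delay

# Illustrative exchange: slave clock runs 500 ns ahead of the master,
# with a 200 ns one-way path delay.
offset, delay = ptp_offset_and_delay(1_000, 1_700, 2_000, 1_700)
```

Once every device on the stage – media server, LED processor, camera, tracker – agrees on a common clock to this precision, a frame can be stamped and matched to a camera position far more tightly than Genlock’s frame-level synchronisation allows.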

With nanosecond calibration, anyone running a VP stage would see their setup times massively reduced, as well as benefit from faster camera tracking. “This is a game changer in terms of creativity,” Plowman adds.

“With nanosecond calibration, I can experiment with opening my camera shutter a little longer or pushing the nits of the LED wall up. I can display and capture true motion blur from my LED wall.”


Image: CVP.

Artificial Intelligence

The rise of AI has affected a huge number of media and entertainment industries – and VP is no different.

“AI models of camera behaviour will likely transform future virtual production sets,” reveals Plowman. “If you develop smart systems that can predict what the next camera move is going to be, that’s going to hugely improve the speed and accuracy of camera tracking and calibration.”

With the right tool, says Plowman, this can all be done in pre-vis, so that camera moves are prepped in seconds without needing to set foot on the LED stage. Considering virtual production tends to involve more work up-front than a standard shoot (leaving minimal post-production work), having AI speed up the preparation process will bring huge benefits.

“So say I have an ARRI LF Mini in a virtual environment and I can do most of my pre-visualisation and scouting work with AI,” Plowman explains.

“I could even use digital doubles to stand in for actors. That means all of the pre-shoot checks can be done without tying up talent time. I can understand what I want to get from an LED, and whether it’s going to work or not, faster than before – setting the rest of production up for success.”
