
Mixed reality: mixing photoreal CGI with live footage, in real time


The Unreal gaming engine brings photoreal live graphics to broadcasting. Image: The Future Group

New ultra-realistic mixing of live footage with CGI is now being made possible by what was once a dedicated gaming engine.

FremantleMedia, producer of The X Factor, the ‘Idols’ reality-TV show format and The Apprentice, may have another hit on its hands — one which may have cracked open a whole new media experience. It integrates advanced virtual TV studio technology with games engines and mobile distribution, merging the physical world with virtual ones. To put it another way, this is genuine Mixed Reality.

Game show Lost in Time is the first concept co-developed by FremantleMedia and The Future Group (TFG) using the technology platform developed by TFG. To date, TFG has invested 463 million NOK (US$57 million) in the platform.

The programme premiered in Norway earlier this year and has just had its first international sale to an Emirates broadcaster which will adapt it for distribution across 22 countries in the Middle East and North Africa.

In the game itself, contestants compete in different challenges in a green-screened studio show against the backdrop of six virtual worlds (Wild West, Ice Age, Medieval Age, the Jurassic Period and so on). The contestants and props are real, but everything else you see on screen is visual effects, to a standard the makers claim was previously possible only on Hollywood movie budgets. Even better, the VFX are rendered in real time in a full multi-cam setup.

The big departure from traditional game shows, though, isn’t just the graphics. What’s unique is that viewers watching at home are also placed into the same virtual environment and are able to participate in exactly the same story as those on TV and to compete against studio contestants via a mobile or tablet app.

At the moment, VR headsets aren’t distributed widely enough to justify a primetime viewing slot for a live show. That’s why TFG’s content officer Stig Olav Kasin says it developed the games for iOS and Android devices, giving a large global audience the chance to compete and engage with the content. “However, once VR headsets are more widespread, it will open up a new world of possibilities for the TV industry to blend the best elements of gaming and traditional TV,” he says.

Signs of this are already evident. Partnered with the Turner-IMG ELEAGUE, TFG injected CG characters into the broadcast of the Street Fighter V Invitational eSports event last spring. Characters from the game struck fighting poses on the studio set, viewable by the studio audience on adjacent screens and by Turner’s TV audience. It took a couple of months to produce the characters, but the resulting animations were combined live with the physical sets and presenters, with no post production.

TFG was at it again for ELEAGUE’s Injustice 2 World Championship, broadcast on TBS, Twitch and YouTube from Atlanta, Georgia (it began last month and continues until November 10). Among the 3D character animations presented to viewers at home as if interacting with the studio audience was Batman. This promises to be the first of a wider deal to augment more superhero characters from the Warner Bros stable in mixed reality.

TFG co-founder Bård Anders Kasin was a technical director at Warner Bros during the making of The Matrix trilogy when he came up with the initial idea for the mixed reality platform.


A new Frontier

The technology platform underlying TFG’s MR format was developed with Canadian broadcast gear maker Ross Video and is being marketed as a standalone software application by Ross.

Branded Frontier, it is promoted as an advanced form of virtual set for the creation of photorealistic backgrounds and interactive virtual objects.

At its heart is the Unreal gaming engine, from Epic Games, used as the backdrop renderer of scenery through features such as particle systems, dynamic textures, live reflections and shadows and even collision detection. This works in tandem with Ross’s XPression motion graphics system, which renders all the foreground elements.
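In practice that amounts to a layered composite rebuilt for every frame: the engine-rendered scenery at the back, the keyed camera picture of contestants and props in the middle, and motion graphics on top. The sketch below is a minimal illustration of that ordering, assuming straight (unpremultiplied) alpha and layers of identical resolution; the type and function names are hypothetical, not the actual Frontier or XPression API.

```cpp
#include <cstdint>
#include <vector>

// One video frame as packed 8-bit RGBA. All layers are assumed to share
// the same resolution for this sketch.
struct Frame { std::vector<uint32_t> rgba; int width = 0, height = 0; };

// Combine the three layers described above, back to front:
// virtual backdrop (engine render), keyed camera (green screen removed,
// alpha marks the talent), then foreground motion graphics.
Frame compositeFrame(const Frame& virtualBackdrop,
                     const Frame& keyedCamera,
                     const Frame& foregroundGraphics)
{
    Frame out = virtualBackdrop;

    auto blendOver = [&out](const Frame& layer) {
        for (std::size_t i = 0; i < out.rgba.size(); ++i) {
            const uint32_t src = layer.rgba[i];
            const uint32_t a   = src >> 24;          // straight alpha, 0..255
            if (a == 0) continue;                    // fully transparent pixel
            const uint32_t dst = out.rgba[i];
            uint32_t res = 0xFF000000u;              // output stays opaque
            for (int c = 0; c < 3; ++c) {            // blend R, G, B channels
                const uint32_t s = (src >> (8 * c)) & 0xFF;
                const uint32_t d = (dst >> (8 * c)) & 0xFF;
                res |= ((s * a + d * (255 - a)) / 255) << (8 * c);
            }
            out.rgba[i] = res;
        }
    };

    blendOver(keyedCamera);        // live action sits in front of the virtual world
    blendOver(foregroundGraphics); // graphics and virtual props sit on top of everything
    return out;
}
```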

Of course, games engines were never designed to work in broadcast. Engines such as Unreal or Unity are superb at rendering high polygon counts, textures and specular lighting as fast as possible on a computer, but they do not natively fit with broadcast signals, which must correspond to the slower, fixed frame rates of SMPTE timecode. However, when it comes to rendering performance, game engines are a real step ahead of anything in a conventional broadcast virtual set.

It’s the difference between rendering a frame in a few milliseconds and delivering frames at a fixed 25 to 50 per second.

What TFG and Ross have done is to rewrite the Unreal code so that the frame rates output by the games engine’s virtual cameras match those recorded by the robotic studio cameras. They have succeeded in putting photorealistic rendering into the hands of broadcasters. The virtual worlds are created in advance with features like global illumination, real-time reflections and real-time shadows, and are rendered live, mixed with the live action photography.
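As a rough illustration of the general problem being solved here (and emphatically not the actual modified Unreal scheduling that TFG and Ross ship), the sketch below paces a renderer that finishes each frame in a few milliseconds to a fixed broadcast cadence, assuming a 50 frames per second output clock.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;

    // Assumed output standard: 50 frames per second, i.e. one frame every 20 ms.
    const double broadcastFps = 50.0;
    const auto framePeriod = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / broadcastFps));

    auto nextDeadline = clock::now();
    for (int frame = 0; frame < 250; ++frame) {       // five seconds of output
        // renderFrame(frame) would stand in for the engine's draw work, which
        // may complete in only a few milliseconds -- far faster than needed.

        nextDeadline += framePeriod;                   // advance the video clock
        std::this_thread::sleep_until(nextDeadline);   // hold the finished frame
        std::printf("frame %d delivered on the broadcast tick\n", frame);
    }
    return 0;
}
```

Each finished frame is held until the next tick of the fixed video clock, so the output stays locked to the broadcast rate however quickly the engine renders.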

Even this would not be possible without the performance of GPU cards from companies like NVIDIA.

According to Kasin, the biggest challenge now for content creators is developing an MR storytelling structure suitable for TV. “Viewers are used to a linear 2D story,” he says. “Creating a unified experience where people can move around freely like in a game simply isn’t possible, or at least no one has cracked it yet. The danger for content creators is that they fall into the trap of making the same stories we have today, simply with new VFX.”

He advises show creators not to get overly focused on the new technical possibilities — “a trap into which many 3D productions have fallen” — but to remember that a good story needs real human drama and emotion.

Engagement is one thing, but Fremantle is also promoting the format’s advertising potential. Product placement could simply be ‘written into’ backdrop animations designed to mimic the virtual environment (think of a Pepsi logo styled to fit a saloon in the Wild West). Commercials could also be created in Unreal Engine so that viewers need not feel they are leaving the show's virtual universe.

Nolan Bushnell, the founder of games developer Atari Corp. and a consultant to the TFG project, claims that the fusion of gaming with TV can "bring a standard construct for new kinds of entertainment."

Other format sales of Lost in Time are pending, while TFG believes the tech’s potential has barely been explored. "What we are producing with now is like the first smartphone," says Bård Anders. “There will be a natural progression of this technology.”

Tags: VR & AR
