01 Mar 2017

Just announced: How the Unreal Games Engine could change the way we make films

Rendered in realtime: the car is effectively a realtime skin over an adaptable chassis. Image: Epic Games


‘The Human Race’ is a new short film created by Epic Games, The Mill and Chevrolet, which celebrates the 50th birthday of the Camaro but whose realtime CG techniques could go on to revolutionise filmmaking.

It was always going to happen. Games engines get better and more photo realistic. Films require increasing amounts of CGI. Filmic CGI takes a long time to render. Games engines render in realtime. What’s not to like? 

Well, quite a lot. Computer games don’t look like real life, and their makers often have different goals and priorities from filmmakers. Games are interactive and largely unpredictable (although they operate within closed worlds). Films are the opposite of realtime. Almost every pixel is planned to the nth degree. Rendering can, within limits, take as long as is needed to give a rich and convincingly photorealistic result. This is how it was possible to make a completely CGI film in the 90s without it looking terrible: it just took a very long time on the computing kit available at the time. 

Meanwhile, computer games are approaching photorealism. They do this in a number of ways, and it’s not just a matter of more pixels. Good games take into account lighting (there are a million different aspects to this) and physics: the ability for a CGI object to have real-world properties and to act according to those characteristics when interacting with other objects. So, for example, if a speeding car crashes into a bus shelter, the street furniture would fall over or break in a realistic manner.  
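
To make that concrete, here’s a rough sketch, in Python and purely for illustration (this is emphatically not how Unreal does it), of the kind of per-frame physics update an engine performs: an object has a mass and a velocity, an impact changes that velocity, and gravity does the rest.

```python
# A rough, illustrative sketch of a per-frame physics update. Not Unreal's
# code; the point is that objects carry real-world properties (mass,
# velocity) and respond to impacts accordingly.
import dataclasses

GRAVITY = -9.81  # m/s^2, acting on the y axis


@dataclasses.dataclass
class RigidBody:
    mass: float        # kg
    position: list     # [x, y] in metres
    velocity: list     # [vx, vy] in metres per second


def apply_impact(body: RigidBody, impulse: list) -> None:
    """An impulse (e.g. from a speeding car) changes velocity by impulse / mass."""
    body.velocity[0] += impulse[0] / body.mass
    body.velocity[1] += impulse[1] / body.mass


def step(body: RigidBody, dt: float) -> None:
    """Advance the body by one timestep using simple Euler integration."""
    body.velocity[1] += GRAVITY * dt
    body.position[0] += body.velocity[0] * dt
    body.position[1] += body.velocity[1] * dt


# A 40 kg bus-shelter panel, initially at rest, struck by a passing car.
# All numbers are invented for the example.
panel = RigidBody(mass=40.0, position=[0.0, 1.0], velocity=[0.0, 0.0])
apply_impact(panel, impulse=[800.0, 200.0])   # newton-seconds
for _ in range(10):
    step(panel, dt=1 / 60)                    # 60 physics ticks per second
print(panel.position)                         # knocked along and now falling
```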

It’s very hard to write a game, let alone a successful one. While there’s no sure-fire way to guarantee that a game will make money, there are ways to make it easier to create one. Chief amongst these is the games engine. 

A games engine is a software component that does much of the heavy lifting needed to create a 3D environment in a game. It handles the shading and the physics. Engines take years to develop, and evolve over time into something incredibly powerful. With a good engine, developers can bring new material into games much faster than if they had to develop the whole technology from scratch. 
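
In outline, that heavy lifting looks something like the loop below: a stripped-down, hypothetical sketch (again in Python, with invented names rather than any real engine’s API) that steps the physics, shades and draws the scene, and does it all inside a 60 frames-per-second budget.

```python
# A stripped-down sketch of a games engine's main loop: step the physics,
# then shade and draw the scene, all inside a fixed frame budget. The names
# here are invented for illustration; this is not Unreal's API.
import time


def update_physics(objects, dt):
    # Give every object simple real-world behaviour (here, just gravity).
    for obj in objects:
        obj["vy"] -= 9.81 * dt
        obj["y"] += obj["vy"] * dt


def shade_and_render(objects):
    # Stand-in for the shading and rendering pass; a real engine evaluates
    # materials and lights on the GPU and rasterises the frame.
    return [f"draw {obj['name']} at y={obj['y']:.2f}" for obj in objects]


def run_frames(objects, frames=3, fps=60):
    dt = 1.0 / fps
    for _ in range(frames):
        start = time.perf_counter()
        update_physics(objects, dt)
        print(shade_and_render(objects))
        # Sleep off whatever is left of the roughly 16.7 ms frame budget.
        remaining = dt - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)


run_frames([{"name": "car", "y": 0.5, "vy": 0.0}])
```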

Modern games are now pretty incredible. They’re vastly more sophisticated than their predecessors. Sometimes you can even think you’re watching a film. 

And that’s the point. Film and games are two areas of technology that are made for each other. The ability to render cinema quality objects and scenes in realtime will transform movie production. 

And it’s starting to happen. 

Unreal from Epic Games is one of the most detailed and convincing games engines on the planet. Each iteration brings new imaging capabilities.

The engine is very adaptable and relatively easy to interface with the outside world, which is what appealed to internationally renowned post and VFX facility The Mill. 

The Mill has been working for a long time on a project to make it possible to simulate any car in a scene, mixing a CGI car with reality. Apparently it’s really hard to get cars for shoots. They might be unavailable, or might simply be too valuable to be put at risk in a film.

Their initial (but limited) success encouraged them to look at ways to make the process better, which is to say more automatic, needing far less manual intervention. 

Their idea, in outline, was to create a “mule” car that could drive on real roads, and which had adjustable properties. It’s not pretty to look at: somewhat like a Batmobile with tracking markers and some strange appendages sticking out above the roof. 

The markers allow VFX artists to understand the movement and orientation of the car, accurately enough to superimpose a “skin” of a CGI model. Ideally, this needs to create a photographically accurate moving image — in realtime. 
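
The principle is straightforward, even if the engineering isn’t: once the markers give you the mule’s position and orientation for a frame, the CGI car’s geometry can be transformed into exactly the same spot in the shot. The sketch below uses invented, illustrative numbers to show the idea; real pipelines solve the pose from many tracked markers per frame.

```python
# Why the markers matter: once the mule's pose (position and orientation)
# is known for a frame, every vertex of the CG car can be moved into exactly
# the same place in the shot. Numbers are invented for illustration.
import numpy as np


def pose_matrix(yaw_rad: float, position: np.ndarray) -> np.ndarray:
    """Build a 4x4 transform from a heading angle and a world position."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    m = np.eye(4)
    m[:3, :3] = [[c, 0.0, s],
                 [0.0, 1.0, 0.0],
                 [-s, 0.0, c]]        # rotation about the 'up' axis
    m[:3, 3] = position
    return m


def skin_vertices(model_verts: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """Transform model-space vertices of the CG car into world space."""
    homogeneous = np.hstack([model_verts, np.ones((len(model_verts), 1))])
    return (homogeneous @ pose.T)[:, :3]


# Pose solved from the Blackbird's markers for one frame (illustrative values).
pose = pose_matrix(yaw_rad=np.radians(15), position=np.array([12.0, 0.0, 3.5]))
cg_car = np.array([[0.0, 0.5, 0.0],
                   [2.0, 0.5, 0.0],
                   [2.0, 0.5, 1.8]])  # a few vertices of the CGI 'skin'
print(skin_vertices(cg_car, pose))
```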

Less than a year ago, The Mill started talking to Epic Games about using Unreal as their realtime engine. Epic immediately saw how it could help and, in the process, create an original and ground-breaking technique that was likely to send ripples through the fabric of filmmaking. 

Unreal is able to create photorealistic images of cars, complete with physics and lighting, in realtime. But in order to do this it needs information. Lots of it. Gigabits per second, in fact. 
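
How many gigabits? Here’s some rough, entirely illustrative arithmetic, just to give a sense of scale. Every figure below is an assumption, not The Mill’s actual capture settings.

```python
# Back-of-envelope arithmetic for the data rate. Every figure below is an
# illustrative assumption, not The Mill's actual capture settings.
width, height = 4096, 2160      # assumed per-camera resolution
bits_per_pixel = 10 * 3         # assumed 10-bit RGB
fps = 30                        # assumed frame rate
cameras = 4                     # four REDs on the rig

bits_per_second = width * height * bits_per_pixel * fps * cameras
print(f"{bits_per_second / 1e9:.1f} Gbit/s uncompressed")
# Roughly 32 Gbit/s before any compression, which is why the wireless link
# to the chase car, and the compression running over it, matter so much.
```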

The motion tracking mule is called, not without some dramatic intent, ‘Blackbird’. It’s a fully working vehicle that’s adjustable by up to four feet in length and ten inches in width. 


The 'raw' Blackbird

That part was relatively simple, but the challenge remained: how do you create cinematic-quality augmented reality (i.e. mixing a CGI element with the real world in realtime)? 

The proboscis sticking out of the top of the Blackbird houses 360 degree data capture equipment, including depth-sensing LIDAR (like you find in self-driving cars) and four RED cameras (which you don’t typically find in self-driving cars, although it would be cool). 

The combination of the 360 degree depth information from the LIDAR and the incident lighting information from the REDs is enough for Unreal to work on shading and lighting the car as a function of the surrounding environment. This is all transmitted wirelessly to a chase car. Directors and cinematographers can see the CGI car in realtime, comfortably in context in a real location. 
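
If you want a mental model for what Unreal does with that data, think of it as image-based lighting: the 360 degree capture becomes a live environment map, and the light falling on any point of the CG car is whatever that map shows in the direction the surface faces. Here’s a very crude, purely illustrative sketch of the idea; it is not Unreal’s actual pipeline, which is far more sophisticated.

```python
# A crude sketch of image-based lighting: treat the stitched 360 degree
# footage as an environment map and look up the incoming light in the
# direction each surface of the CG car faces. Illustrative only.
import numpy as np


def normal_to_equirect(normal: np.ndarray, width: int, height: int):
    """Map a unit surface normal to pixel coordinates in an equirectangular map."""
    x, y, z = normal / np.linalg.norm(normal)
    u = (np.arctan2(z, x) / (2 * np.pi) + 0.5) * (width - 1)
    v = (np.arccos(np.clip(y, -1.0, 1.0)) / np.pi) * (height - 1)
    return int(u), int(v)


def diffuse_light(env_map: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Very rough diffuse shading: the light hitting a point is whatever the
    environment shows in the direction that point faces."""
    u, v = normal_to_equirect(normal, env_map.shape[1], env_map.shape[0])
    return env_map[v, u]


# Stand-in for one frame of stitched, linearised 360 degree footage (H x W x RGB).
env = np.random.rand(512, 1024, 3).astype(np.float32)
print(diffuse_light(env, normal=np.array([0.0, 1.0, 0.0])))  # an upward-facing panel
```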

The volume of data transmitted from the Blackbird, and the amount of computing needed to do all this in realtime, is mind-blowing, but not unreasonably so: Unreal runs even on consumer-grade PCs, as long as they’re fast enough and have a good enough GPU. 

What does this mean for the film industry? Ultimately, a lot. While the Blackbird/Unreal demonstration is a tour-de-force, it is still a rather special case. But there is absolutely no reason why this technology (i.e. a tracking “mule” and a system to feed back realtime depth and lighting data to Unreal) shouldn’t be adapted and applied in many other cases. 

At the very least, it’s an incredibly good way for Directors to visualise CGI elements in real-world contexts in realtime. You can imagine feeding the AR images to a handheld monitor that effectively becomes the viewfinder of a “virtual camera”. 

And beyond that, as the use of CGI in films continues to grow, there will be more and more need to see the results in realtime. It’s even possible that games engines will be the primary technology for making animated and mixed reality films. 

Ultimately the threshold is: does it look convincing to a typical filmgoer?

It looks like, in this instance, that threshold has already been crossed. 

Have a look at the short below and you can read the full press release over the page.





David Shapton

David is the Editor In Chief of RedShark Publications. He's been a professional columnist and author since 1998, when he started writing for the European Music Technology magazine Sound on Sound. David has worked with professional digital audio and video for the last 25 years.
