
How The Midnight Sky extensively used virtual production to recreate impossible environments

George Clooney in The Midnight Sky, which extensively used virtual production. Image: Netflix.

Released on Netflix in 2020, sci-fi drama The Midnight Sky made extensive use of virtual production technology to tell a story set in two harsh and alien environments: deep space and the Arctic.

The Midnight Sky was directed by and stars George Clooney, with Martin Ruhe behind the camera and Matt Kasmir supervising the visual effects. Based on the novel Good Morning, Midnight, the film follows Clooney’s isolated Arctic scientist as he tries to warn the crew of a Jovian space mission that they are returning to a post-apocalyptic, radioactive Earth.

Virtual production technology was used from the outset to previsualise key scenes, including a spacewalk and a sequence in which a habitation pod sinks into freezing water. 3D models of the respective environments were built and populated with rough virtual humans. Moving around a studio space of the appropriate size, demarcated with white tape, Ruhe could see the virtual scene on his iPad. When he and Clooney had selected their shots and recorded them, editor Stephen Mirrione cut them together. After reviewing the edit, the team could record additional or alternative virtual shots and iterate until they were satisfied.

Virtual production sequences

When it came time to shoot the “real” spacewalk, a partial mock-up of the craft was built without any texture or detail, and surrounded by black drapes. Some characters were portrayed by actors flown on wires, while others were CGI with performance-captured faces. Kasmir’s team also inserted the finished spacecraft and added stars to the black backgrounds.

The pod sinking sequence was shot partly on an ordinary soundstage and partly on an underwater stage. For the former, although Arctic backgrounds would be inserted in post, the filmmakers chose not to shoot against green screen. Instead, gaffer Julian White rigged a 280ft x 40ft grey screen of the kind normally used for rear projection, and lit it with numerous ARRI SkyPanels. By altering the settings on the SkyPanels, he and Ruhe could obtain exactly the right shades of blue-grey light needed to represent an Arctic night-time exterior.
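As a rough illustration of why dialling an LED fixture towards a high colour temperature reads as blue-grey, the sketch below converts a correlated colour temperature (CCT) to approximate sRGB values using Tanner Helland's widely published curve fit. It is a generic approximation, not ARRI's SkyPanel firmware, and the 9,000K example is an assumed value rather than a setting reported by the production.

```python
import math

def cct_to_rgb(kelvin):
    """Approximate sRGB for a black-body colour temperature.
    Tanner Helland's curve fit, usable from roughly 1,000-40,000 K."""
    def clamp(v):
        return max(0, min(255, round(v)))

    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    return clamp(r), clamp(g), clamp(b)

# Tungsten reads warm; a very high CCT reads blue-grey, the sort of
# quality wanted for an Arctic night (9,000 K here is illustrative).
print(cct_to_rgb(3200))   # ~(255, 184, 123): warm
print(cct_to_rgb(9000))   # ~(210, 223, 255): cool blue-grey
```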


Felicity Jones goes full virtual production in The Midnight Sky. Image: Netflix.

This was far from the only scene in which the crew took pains to get the most realistic lighting. When the astronauts had to be surrounded by boldly coloured holographic imagery, Ruhe’s team hung LED ribbons emitting the appropriate light; the LEDs were subsequently painted out and the holograms added.

To show the views out of the spacecraft windows, the production employed ILM’s StageCraft LED volume at Shepperton Studios. There, 1,400 ROE Visual Black Pearl panels with a pixel pitch of 2.8mm displayed real-time backgrounds that could be captured in camera, while a further 264 panels with a 3.47mm pitch provided interactive light from off camera.
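Some back-of-envelope arithmetic puts those figures in perspective. The panel dimensions below are assumptions based on ROE's commonly used 500mm-square Black Pearl form factor (176 x 176 pixels per panel); the article itself states only the panel count and the pixel pitch.

```python
# Rough figures for the main image wall described above.
PANELS = 1400
PX_PER_PANEL_SIDE = 176    # assumed Black Pearl resolution per side
PANEL_SIDE_M = 0.5         # assumed 500 mm square panel
PITCH_MM = 2.8             # from the article

total_pixels = PANELS * PX_PER_PANEL_SIDE ** 2
wall_area_m2 = PANELS * PANEL_SIDE_M ** 2

# One common rule of thumb: keep the camera at least (pitch in mm)
# metres from the wall to avoid visible pixel structure and moire.
min_camera_distance_m = PITCH_MM

print(f"~{total_pixels / 1e6:.1f} megapixels over ~{wall_area_m2:.0f} m^2")
print(f"suggested minimum camera distance: ~{min_camera_distance_m} m")
# -> ~43.4 megapixels over ~350 m^2, camera no closer than ~2.8 m
```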

This system was also used extensively for earthbound scenes in the Arctic research station. Given that the station featured very large windows, the realistic reflections and lighting provided by the LED screens made them a far more attractive choice than traditional green screen.

A plate crew first shot backgrounds in Iceland using a three-camera rig of Alexa 65s. ILM processed these plates into a 3D virtual environment in Unreal Engine 4. During principal photography, witness cameras tracked the movements of Ruhe’s Alexa 65 within the station set, feeding the data to Unreal, where it was used to render the correct perspective and parallax on the screens. The StageCraft team could change the virtual sky, time of day and weather conditions at will.
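ILM's renderer is proprietary, but the geometry such a system must solve every frame is well documented: given the tracked camera's position, build an off-axis (asymmetric) projection through the fixed plane of the LED wall, so the displayed imagery carries the correct perspective and parallax for the lens rather than for the screen. The sketch below follows Robert Kooima's "Generalized Perspective Projection" formulation; the wall dimensions and camera position are illustrative, not the Shepperton rig's.

```python
import numpy as np

def off_axis_frustum(pa, pb, pc, pe, near):
    """Asymmetric frustum extents for a camera at pe viewing a planar
    screen with corners pa (lower-left), pb (lower-right), pc (upper-left),
    after Kooima's 'Generalized Perspective Projection'."""
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up axis
    vn = np.cross(vr, vu)                      # screen normal
    vn /= np.linalg.norm(vn)

    va, vb, vc = pa - pe, pb - pe, pc - pe     # corners relative to camera
    d = -np.dot(va, vn)                        # camera-to-wall distance
    scale = near / d
    left = np.dot(vr, va) * scale
    right = np.dot(vr, vb) * scale
    bottom = np.dot(vu, va) * scale
    top = np.dot(vu, vc) * scale
    return left, right, bottom, top            # glFrustum-style extents

# A hypothetical 10 m x 4 m wall in the z=0 plane. As the tracked camera
# position pe changes each frame, the frustum, and hence the rendered
# parallax, shifts with it.
pa = np.array([-5.0, 0.0, 0.0])
pb = np.array([5.0, 0.0, 0.0])
pc = np.array([-5.0, 4.0, 0.0])
print(off_axis_frustum(pa, pb, pc, pe=np.array([1.0, 1.8, 4.0]), near=0.1))
```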

“Only the section of the screen that can be seen by the camera moves,” notes Kasmir in a behind-the-scenes featurette. “The rest of it remains rock solid at a lower resolution so that it can continue to light the set.”
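A hedged sketch of how a system might decide which part of the wall that moving section covers: intersect the camera frustum's four corner rays with the wall plane and take a padded bounding box around the hits. This is generic geometry for illustration, not StageCraft's actual implementation.

```python
import numpy as np

def inner_frustum_bounds(cam_pos, corner_dirs, wall_point, wall_normal, pad=0.25):
    """Intersect the camera frustum's four corner rays with the wall plane
    and return a padded bounding box: the region that needs full-resolution,
    per-frame parallax rendering. Outside it, content can stay static at
    lower quality while still lighting the set. Illustrative only."""
    hits = []
    for d in corner_dirs:                     # unit ray directions
        denom = np.dot(d, wall_normal)
        if abs(denom) < 1e-6:
            continue                          # ray parallel to the wall
        t = np.dot(wall_point - cam_pos, wall_normal) / denom
        if t > 0:                             # wall is in front of the camera
            hits.append(cam_pos + t * d)
    pts = np.array(hits)
    return pts.min(axis=0) - pad, pts.max(axis=0) + pad
```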

Reality and illusion were seamlessly mixed in a snowstorm sequence. The first unit travelled to Iceland to record the bulk of the sequence on location, some of it during a real 50mph snowstorm. Later, the balance of the scene was captured on stage at Shepperton. The crew’s experience of the real thing, so severe that Clooney’s eyelids often froze shut, led to a more extreme studio recreation than they had planned.

In an ASC Clubhouse Conversation, Ruhe recounts how one of the Icelandic crew was fooled when he visited the snowstorm set: “He looked at an image on the monitor and he said, ‘Yeah, we shot that on that day,’ but no, we shot that this morning here.” 

Shortly after The Midnight Sky wrapped, the COVID-19 pandemic began, and virtual production, already heralded by some as the future of filmmaking, suddenly accelerated in development. There can be little doubt now that it’s here to stay.


Tags: Production, Virtual Production
