
RAF Virtual Experience: This is what it takes to edit 360 video

Stitching together 360 footage is incredibly resource intensive. Image: UK MOD © Crown copyright 2021.

Last time Tony Newton and Paul Gale of the RAF Media Reserves detailed how they put together a complex 360 experience shoot. But that was only half the battle. This time they take us through what it took to put it all together in post production.

Here's a link to part 1 in case you missed it.

Capturing the 360 content was only half the battle - perhaps less. To put some stats to the project, with six 4K camera streams at 120Mbps plus the audio data recorded, we captured some 1.5TB over five days of shooting, downloading to backup each evening. That evening backup and reset ritual is a reminder that battery life is a rate-limiting step in 360 filming, just as in any other kind of shoot. The Insta360 Pro 2 battery only lasts around 50-60 minutes, so for this project we had to source several 12V battery packs capable of lasting 3-4 hours.
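
As a rough back-of-the-envelope check (our illustration here, not anything taken from the project's logs), the short Python sketch below shows how much actual rolling time roughly 1.5TB represents at six 120Mbps streams:

```python
# Rough capacity check for the shoot figures quoted above (illustrative only).
# Assumes six camera streams at 120 Mbit/s each; audio and overheads ignored.

STREAMS = 6
BITRATE_MBPS = 120          # per-stream video bitrate, Mbit/s
TOTAL_CAPTURED_TB = 1.5     # approximate data captured over five days

video_rate_mb_per_s = STREAMS * BITRATE_MBPS / 8   # ~90 MB/s across all streams
total_mb = TOTAL_CAPTURED_TB * 1_000_000           # TB -> MB (decimal units)

recording_seconds = total_mb / video_rate_mb_per_s
print(f"Combined write rate: {video_rate_mb_per_s:.0f} MB/s")
print(f"Implied recording time: {recording_seconds / 3600:.1f} hours over the five days")
# -> roughly 4-5 hours of actual rolling time, which is why the nightly backups
#    and the long-life 12V battery packs mattered so much.
```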

Editing beast

It took over 250 hours of editing, using a high-end PC with an AMD Ryzen 5900X 12-core CPU, 64GB RAM, an RTX 2070 Super and 2TB of NVMe storage, to produce the final seven-minute end product. The PC was bought specifically for the project and was a balance of power versus cost, as we had a limited budget to buy all the kit we needed. The software used at various stages of the edit included the Mistika VR stitcher, Premiere Pro, After Effects, the Reaper digital audio workstation (DAW) and the Facebook 360 Spatial Workstation plugins.

An additional issue was the need to optimise the viewer experience by matching the 5.7K footage from the Insta360 One R (fitted with its dual-lens 360 module) to the 8K produced by the Insta360 Pro 2. Whilst the One R is a great little action camera, the Pro 2 produces much better images.
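
One small part of that matching job is simply conforming everything to a common equirectangular resolution and an edit-friendly codec. The sketch below is illustrative only: it assumes the stitched One R clips are 5760x2880 equirectangular, that ffmpeg is available, and that the filenames are placeholders rather than our actual pipeline settings.

```python
# Illustrative only: conform a 5.7K One R clip to the 8K timeline resolution
# and a ProRes 422 intermediate. Source size, scaler choice and filenames are
# assumptions for the example, not the project's actual settings.
import subprocess

SRC = "one_r_clip_stitched.mp4"    # hypothetical stitched equirectangular source
DST = "one_r_clip_8k_prores.mov"

subprocess.run([
    "ffmpeg", "-i", SRC,
    "-vf", "scale=7680:3840:flags=lanczos",   # 5.7K -> 8K equirectangular upscale
    "-c:v", "prores_ks", "-profile:v", "2",   # ProRes 422 intermediate
    "-c:a", "copy",
    DST,
], check=True)
```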


Putting together the audio using a spatialiser. Image: UK MOD © Crown copyright 2021.

Orientation

To ensure that the viewer would always be looking in the right direction at the start of the video, and so wouldn't miss any action, we added a ‘look in this direction’ cue that doubles as an opportunity to settle into the headset for viewers who haven’t experienced VR before. This was produced using a futuristic-looking 3D CGI projector, composited over the top of an empty 360 shot in the C-17 hold. It was created in After Effects using the Video Copilot Element 3D plugin and Red Giant’s Holomatrix plugin for the holographic elements. A lot of work, but we think it sets the scene very effectively for what’s to come.

360 video is a new and rapidly developing technology, so there are many standards and formats for delivery - Facebook, YouTube VR, Oculus TV and sideloading direct to the device all require something slightly different to give an optimal experience (or to run at all). This all adds to the processing-time overhead, which can easily be overlooked when planning and costing a project.

In this case, we captured stereoscopic (3D) 8K video at 30fps and stitched it to 8K monoscopic, outputting to ProRes 422 to help performance while editing (the highly compressed master MP4 files certainly wouldn’t play smoothly, even on a high-end PC). We decided that, on balance, the monoscopic footage gave the best result versus the additional editing time and difficulties stereoscopic would have caused. The audio from the Zoom H3-VR, the mono radio mic channels and the sound effects was mastered in the Reaper DAW as a third-order ambisonic mix (16 channels). This allowed us to output final versions at first or second order as required, and video at whatever resolution a specific target channel needed. For smoothness of editing, source files were transcoded to Apple ProRes 422 and all intermediate passes (e.g. noise reduction, object removal, motion blur and visual effects) used the same format.
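
For anyone new to ambisonics, the channel counts follow directly from the order: an order-N mix carries (N+1)² channels, and in the common ACN channel ordering a lower-order version is simply the first few of those channels. A minimal Python illustration of that relationship (not part of the Reaper/FB360 workflow itself):

```python
# Illustrative only: ambisonic order vs. channel count, and naive order
# truncation in ACN channel ordering (the first (M+1)^2 channels of a
# higher-order mix carry the order-M content).

def ambisonic_channels(order: int) -> int:
    """Number of channels in an order-N ambisonic mix."""
    return (order + 1) ** 2

for order, name in [(1, "first"), (2, "second"), (3, "third")]:
    print(f"{name}-order ambisonics: {ambisonic_channels(order)} channels")

# Truncating a 16-channel third-order frame down to first order (4 channels):
third_order_frame = list(range(16))   # placeholder sample frame, one value per channel
first_order_frame = third_order_frame[:ambisonic_channels(1)]
print(first_order_frame)              # -> [0, 1, 2, 3]
```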

360 or not?

This article should convince you that immersive 360 filming is time-consuming and fraught with ‘elephant traps’. Even so, we both agree that this first foray into immersive video has been one of the most rewarding projects of our RAF Reservist and civilian careers. Seeing a viewer’s “OMG!” reaction to an immersive product you’ve put together is a real buzz, and even more so when that reaction is repeated time and again with each new viewer.

The ability to get the VR viewer ‘up close and personal’ to assets and activities that would otherwise be closed to them is a creative, PR and communications dream that will mature as VR headsets become cheaper, more widely used and the demand for quality content soars.

See how the full experience was put together in the behind-the-scenes video below.

View the full VR experience on YouTube with this link, or click on the image below to be taken to the Oculus TV version.

