
The URSA Mini’s best-kept secret?


You spin me round: How significant is the URSA Mini's gyroscope?

There has been a lot of focus and attention on the new Blackmagic URSA Mini and its abilities, such as its 15-stop dynamic range, well thought out ergonomics and lighter weight than the full-size URSA. However, one of its hidden talents may also be one of the most groundbreaking, especially given the company's acquisition of the Fusion VFX software.

I have written more than once about the way that cameras could use additional capabilities, such as depth information, if they could record it. There are now real-world products with cameras that can do this, and they are using that information in the ways I suggested: recreating a scene in 3D, producing stereo 3D results, matting out objects and backgrounds, and adjusting focus. This is now a reality, and hopefully before long all cameras will be able to record this data. I also admit to feeling rather smug about this, since I was told a number of years back by more than one industry expert that such functionality could never be derived from depth information and that it would be a pointless thing to pursue! But I digress…

While the URSA Mini cannot record depth information, there is one other aspect that I have commented upon that does appear to have made it into the URSA Mini: orientation metadata.

The URSA Mini houses a gyroscope and a GPS to record not only world positioning data but also the orientation of the camera. This is an interesting addition and has some rather useful potential applications, depending on how Blackmagic has implemented it.

If the metadata is only recorded once per shot, then this may be a dead duck. However, if the gyroscopic information is recorded throughout the video file, this may have profound uses in post production.
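To illustrate the difference, here is a minimal sketch with entirely hypothetical field names, since Blackmagic has not published a metadata format: a single orientation value only tells you how the camera was pointing at the start of the shot, whereas a track of pitch, roll and yaw samples throughout the clip is what gives post production something to work with.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class OrientationSample:
    """One gyroscope reading for one frame (field names are hypothetical)."""
    frame: int
    pitch_deg: float  # tilt up/down
    roll_deg: float   # rotation around the lens axis
    yaw_deg: float    # pan left/right


@dataclass
class ClipOrientation:
    """Orientation metadata attached to a clip."""
    clip_name: str
    samples: List[OrientationSample]

    def is_per_frame(self) -> bool:
        # One sample per shot is little more than a slate note;
        # a sample per frame is what makes match-move-free compositing plausible.
        return len(self.samples) > 1
```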

VFX software that can read such data will be able to use it to easily place 3D graphics within a shot environment, even if the footage was filmed in a shaky handheld style, without the need for cumbersome match-move tracking. Such data could also be used to help with shot stabilisation. In fact, RedShark previously covered an innovative add-on device for cameras called SteadXP, which produced some stunning post production stabilisation results by using exactly this kind of data. As an aside, combining gyroscopic information with a depth map would also make it possible to minimise perspective warping during such processing, but that's for the future!
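To make that concrete, here is a minimal sketch of how per-frame angles could drive stabilisation. It is not Blackmagic's or SteadXP's actual method; it simply assumes pitch, roll and yaw are available in degrees for each frame, turns each frame's attitude into a rotation matrix, and computes the counter-rotation back to a reference pose. A real stabiliser or compositor would then convert that rotation into an image warp using the lens field of view.

```python
import numpy as np


def rotation_matrix(pitch_deg: float, roll_deg: float, yaw_deg: float) -> np.ndarray:
    """Camera attitude as a 3x3 rotation matrix.

    Convention (an assumption): pitch about X, yaw about Y, roll about Z,
    applied in yaw-pitch-roll order, with the camera looking down +Z.
    """
    p, r, y = np.radians([pitch_deg, roll_deg, yaw_deg])
    Rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    Ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    Rz = np.array([[np.cos(r), -np.sin(r), 0], [np.sin(r), np.cos(r), 0], [0, 0, 1]])
    return Rz @ Rx @ Ry


def stabilising_rotation(frame_angles, reference_angles) -> np.ndarray:
    """Counter-rotation that takes a shaky frame's pose back to a reference pose."""
    R_frame = rotation_matrix(*frame_angles)
    R_ref = rotation_matrix(*reference_angles)
    return R_ref @ R_frame.T  # rotation from the frame's pose to the reference pose


# Example: counter-rotate a frame recorded at (pitch 2.1°, roll 0.4°, yaw -1.3°)
# back towards the pose of the first frame of the shot.
correction = stabilising_rotation((2.1, 0.4, -1.3), (0.0, 0.0, 0.0))
```

The same per-frame rotations, fed to a 3D compositor as a camera animation, are what would let a graphic sit in a handheld shot without a separate match-move pass.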

As I mentioned earlier, the usefulness of this metadata will depend greatly on how Blackmagic has implemented it. But if the company has indeed enabled realtime gyroscopic data to be captured, this could be very useful for VFX producers. It is tantalising that the press release states: “…Built in gyroscope allowing recording of camera pitch, roll and yaw movements when working in RAW.”

The key word there is “movements”, which indicates that the camera may well record realtime information. Interestingly, it also states specifically that this is a function of recording in RAW, which may hint that it is aimed at extensive post production work.

If nothing else, this is a glimpse into the thought process of Blackmagic, which appears to be treating its products very much as a complete system that can be used from production through to release. Much like Apple, and unlike other camera manufacturers, Blackmagic now has some very popular cameras on the market as well as very popular and capable software, which means it can introduce innovative camera capabilities and ensure that people can take advantage of them quickly and seamlessly.

Tags: Production
