
NAB Diaries Day Three: The extraordinary things you can do in DaVinci Resolve and a new Fujifilm zoom

4 minute read

Phil Rhodes continues his look around the NAB show floor, witnessing what would pass for magic in another era on the Blackmagic stand and getting a sneak look at a new Fujinon zoom.

We made it very clear yesterday that this year’s NAB show involved far less artificial intelligence than some people seemed to expect and desire. Today, however, on the last full day of the 2023 exposition, Resolve’s new AI-powered features conspire to make us reevaluate that assessment.

Like many of the applications of artificial intelligence, the story is of a nearly-fantastic capability that everyone had long recognised as desirable but largely given up as impossible. The automatic transcription features have been discussed extensively elsewhere. Suffice to say, then, that Resolve 18.5 has the ability not only to transcribe spoken material, but also to perform video edits based on an edited version of that transcribed text. It’s a very fast way to rough out the dialogue edit of a documentary, and in many ways harks back to the days when a paper edit was taught in media production classes as the only proper way to begin the post production process. Now, that paper edit can directly guide the actual assembly of a working timeline.

Blackmagic certainly isn’t the only outfit at the show using AI to transcribe natural language. Premiere Pro has very similar features, and if various offerings on the trade show floor are anything to go by, anyone who finds subtitles useful can look forward to an imminent future in which many more things are subtitled, especially live television and news, and probably at least as accurately as a typo-prone human being would manage.

Depth mapping


It might not look much here, but this is really very clever indeed

Other Resolve AI features are at least arguably showier, though. One of them produces a depth map of the scene, the sort of thing which might previously have required a stereoscopic camera and which would have been noisy and flaky on certain subjects even then. Resolve’s AI-based estimation of the depth map can be keyed and composited in the node tree like anything else, which not only allows for much more realistic simulation of depth-cued effects like fog, but also allows for entire grades to affect only certain depth ranges, which will often make it easier to isolate foreground subjects like people.
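The idea of keying a depth range can be sketched in a few lines. This is a hypothetical NumPy illustration, not Resolve's implementation: `depth_range_matte`, the depth values, and the simple exposure "grade" are all invented for the example, with depth normalised so 0.0 is near and 1.0 is far.

```python
import numpy as np

def depth_range_matte(depth, near, far, softness=0.05):
    """Return a 0..1 matte selecting pixels whose depth lies in [near, far],
    with a soft roll-off at each edge (a soft key on the depth channel)."""
    lo = np.clip((depth - (near - softness)) / softness, 0.0, 1.0)
    hi = np.clip(((far + softness) - depth) / softness, 0.0, 1.0)
    return lo * hi

# Toy 2x2 depth map: top row near the camera, bottom row far away.
depth = np.array([[0.1, 0.2],
                  [0.8, 0.9]])

# Key the foreground: everything closer than 0.3.
matte = depth_range_matte(depth, near=0.0, far=0.3)

# Apply a grade (here, a crude exposure lift) only inside the matte,
# leaving the background untouched.
image = np.full(depth.shape + (3,), 0.5)   # flat grey test image
graded = np.clip(image * 1.5, 0.0, 1.0)    # the "grade"
m = matte[..., None]
out = m * graded + (1.0 - m) * image
```

Depth-cued fog works the same way, only mixing towards a fog colour in proportion to the depth value rather than qualifying a grade.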

A sister feature aims to produce a normal map of the scene, a concept which risks only making instinctive sense to people whose experience includes designing objects and materials in computer-generated imagery. The idea of a normal map is that each pixel encodes a direction - the three components of a surface-facing vector - using the red, green and blue values of the image to describe it. Applied to a flat object, a normal map won’t actually deform the object’s surface, but it can give a powerful impression of fine surface detail by altering the way the object’s surface reflects light. Viewed obliquely, the lack of real relief is visible, but in most circumstances it’s a great way to simulate, say, a textured metal surface.

In Resolve, creating a normal map allows for some pretty comprehensive relighting, with objects in the scene reflecting new lights much as they would have had those lights existed in the original scene.
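The mechanics can be sketched briefly. This is a generic illustration of the conventional normal-map encoding (rgb = normal × 0.5 + 0.5) and basic Lambertian shading, not a description of Resolve's internals; the function names and the toy two-pixel map are invented for the example.

```python
import numpy as np

def decode_normal(rgb):
    """Map 0..1 RGB values back to a -1..1 direction vector and renormalise."""
    n = rgb * 2.0 - 1.0
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

def relight(normals, light_dir, light_colour):
    """Lambertian shading: per-pixel max(0, N dot L) times the light colour,
    i.e. surfaces facing the new light pick up its contribution."""
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    ndotl = np.clip(np.sum(normals * l, axis=-1), 0.0, None)
    return ndotl[..., None] * np.asarray(light_colour, dtype=float)

# A 1x2 normal map: one pixel facing the camera (0, 0, 1),
# one facing screen-right (1, 0, 0).
rgb = np.array([[[0.5, 0.5, 1.0],
                 [1.0, 0.5, 0.5]]])
normals = decode_normal(rgb)

# A new light shining straight from the camera: only the
# camera-facing pixel catches it.
contrib = relight(normals, light_dir=(0, 0, 1), light_colour=(1.0, 1.0, 1.0))
```

The characteristic lilac-blue of most normal maps falls out of this encoding: a surface facing the viewer is (0, 0, 1), which maps to RGB (0.5, 0.5, 1.0).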

Now, let’s be completely fair. This is all great, but scowling, embittered cynics such as your narrator will inevitably point out that the depth and normal maps generated by techniques like these are never going to be pixel-perfect, and the test footage being used on Blackmagic’s booth has likely been selected to ensure the new features work at least reasonably well. Even so, depth maps (not to mention normal maps, which most people quite literally didn’t know they needed) have been something of a holy grail for colourists for years, and they now seem to have become a reality. In a piece of software that’s perpetually licensed for $295, it really is exceptionally difficult to argue.

A room with a zoom


A mock up of a forthcoming Fujinon 24-300mm T2.9-4.2 zoom lens

Over on the far side of Central Hall is a series of cubicles, each with just enough room for a table and half a dozen people, which are often used by manufacturers who want to run closed-door meetings about products which aren’t quite ready for public disclosure yet. Fujifilm is one of those companies, and we’ve enjoyed more than a few introductions to the company’s upcoming toys in those little rooms at NAB. One of the products which is ready for announcement - if some way from actually being on sale - is an upcoming 24-300mm T2.9-4.2 zoom lens, built very clearly in broadcast format like the much-adored ZK19-90mm and XK20-120mm lenses.

The object you see in the accompanying photographs is not a lens; it’s a mockup, and even the earliest finished versions are months away, not to mention final production examples for review or sale. As such, everything we’re discussing here should be considered on the basis that we don’t know if this thing costs $50 or $50,000. That said, your narrator was also lucky enough to see the MK zooms when they were in this state, and they’re now seen on every other camera all over the world, so the company has earned some legitimacy. To some extent, the idea of Fujifilm launching another zoom will be no great surprise to anyone, and some people will gripe about the modest T4.2 long end - though really, 300mm on something which looks like a chunky ENG lens means that’s more or less inevitable.

There’s also no price given, on the reasonable basis that few if any of these have actually been made, and the company is understandably hesitant to announce a number without being sure it’s achievable. This (mockup of a) lens is perhaps not solely targeted at the cinema market; it’ll presumably be seen on the most capable broadcast cameras covering glossy events, particularly at high or very high resolution. That notwithstanding, the tendency of high-end, single-camera drama shoots to shoot large amounts of their runtime on Fuji zooms, and then claim they didn’t, is something everyone knows but nobody talks about. 24-300mm covers a lot of ground, and modern cameras are fast.

And now, the end is near...

We’ll wrap up NAB 2023 tomorrow as the remains of the newsroom coffee, having been subjected to a vigorous, four-day boiling, congeal into a sort of journalism-fuelling caffeine syrup. In the meantime, let’s go and make one more round of the show floor in the knowledge that the exhibitors are now at their most exhausted and there’s never been a better time to canvass them for injudicious comments on the next big thing.

Tags: Post & VFX Production