Phil Rhodes gives us his report from the recent BSC Expo in London. There were some big screens on show!
There’s a certain traditionalism about the BSC Expo in London, tied as it is to an organisation dating back to barely post-war filmmaking. The exhibitor base is determinedly high-end and not a little inclined toward big, heavy cameras and big, heavy lenses on big, heavy bits of grip equipment. Nothing’s ever quite money-no-object, but it’d be easy to associate the BSC Expo with the idea of money being not much object.
But if there’s any technique that’s sometimes too expensive even for the high end, while still being an object of fascination to more or less everyone, it’s LED volumes. And yes, if there was a detectable trend at the show, it was giant video walls and the interactive lighting designed to complement them. The zone – it’s hard to refer to such a postcode-having area as a mere booth – occupied by virtual production specialists disguise and retailer CVP at the entrance to the show was dominated by the curved display seen in the photographs accompanying this article. Much as it’s the largest TV most of us will ever have seen up close, it’s also fairly small fry in terms of LED volumes in general.
Going by the conversation around that wall, a few questions come up again and again. The wall looks as though it has a region in the middle containing a smaller copy of the image because that’s (approximately) the part of the wall the camera can actually see. The rest of the display shows a static image approximating the environment from the actors’ point of view, the better to provide interactive lighting and something for the actors to look at. As the camera pans and tilts, that inset area moves around on the LED wall and may change size.
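The geometry of that inset – often called the inner frustum – can be sketched in a few lines. This is a toy model assuming a flat wall square-on to the lens axis; real systems track the full camera pose and render the frustum in a game engine, and the function names here are purely illustrative.

```python
import math

def inner_frustum_width(camera_to_wall_m: float, horizontal_fov_deg: float) -> float:
    """Width of the wall region the taking camera actually sees,
    assuming a flat wall perpendicular to the lens axis."""
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    return 2.0 * camera_to_wall_m * math.tan(half_fov)

def inner_frustum_centre(pan_deg: float, camera_to_wall_m: float) -> float:
    """Horizontal offset of the inner frustum's centre as the camera
    pans, measured along the wall from the point directly ahead."""
    return camera_to_wall_m * math.tan(math.radians(pan_deg))

# A camera 4 m from the wall with a 60-degree horizontal field of view
# sees a region roughly 4.6 m wide; panning 20 degrees slides that
# region about 1.5 m along the wall.
print(round(inner_frustum_width(4.0, 60.0), 2))   # ~4.62
print(round(inner_frustum_centre(20.0, 4.0), 2))  # ~1.46
```

Zooming in narrows the field of view, which is why the inset shrinks as well as moves.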
Notice the cellular pattern just visible in this shot. The stills camera which took it was not synchronised with the GhostFrame tracking data, so the pattern is visible.
Overhead, we can see a row of Quasar Science LED tubes, which can, via various control protocols, use their subpixels to produce animated lighting effects based on the 3D environment. That keeps the environmental simulation going outside the screen area, and with far better colour quality than the (dismal) colour rendering of the display itself. Some LED volumes – which can then more realistically be called a “volume”, as opposed to a fancy animated Translight – put more display panels overhead to the same end, with the additional benefit of, well, being able to point the camera upward. If all we want is the interactive lighting, the Quasar tubes are a much (much, much) cheaper way to cast similar light.
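The principle behind driving those tube pixels is simple enough to sketch: sample a strip of the rendered environment and downsample it to the tube’s pixel count. This is a hypothetical illustration – the function name and the crude averaging scheme are mine, and a real rig would push the result out over a lighting control protocol such as sACN rather than printing it.

```python
def tube_pixels_from_env_row(env_row, tube_pixel_count):
    """Downsample one row of the rendered environment (a list of
    (r, g, b) tuples) to per-pixel colours for an overhead LED tube,
    by averaging each contiguous chunk of the row."""
    chunk = len(env_row) / tube_pixel_count
    out = []
    for i in range(tube_pixel_count):
        lo, hi = round(i * chunk), round((i + 1) * chunk)
        samples = env_row[lo:hi]
        out.append(tuple(sum(c[j] for c in samples) // len(samples)
                         for j in range(3)))
    return out

# A 1920-sample sky row – cool blue shading into a warm horizon –
# averaged down to a hypothetical 16-pixel tube.
row = [(90, 120, 200)] * 960 + [(240, 200, 120)] * 960
pixels = tube_pixels_from_env_row(row, 16)
print(pixels[0], pixels[-1])  # (90, 120, 200) (240, 200, 120)
```

Run fast enough, per frame of the environment render, that turns a row of tubes into a very low-resolution extension of the wall.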
So far, so straightforward – but what’s that strange pattern of irregular shapes overlaid on the LED wall image in some of these photos? Notice the TrackMen camera slung under the taking camera’s lens, which handles camera position tracking using technology from another company, GhostFrame, which in turn uses an LED processing platform from Megapixel VR. The tracking data – the shapes – is superimposed on the image without being visible to the taking camera by leveraging the synchronisation that’s necessary anyway: the tracking patterns appear only while the taking camera’s electronic shutter is closed. Thus they’re visible to the tracking camera but not the audience. Similar techniques can be used to support more than one camera in a multi-camera live LED volume, with the cameras kept deliberately out of synchronisation so that no camera sees the background image rendered for another.
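The time-multiplexing idea can be sketched as a toy scheduler. The numbers here are illustrative only – a camera frame split into four wall refreshes with a 180-degree shutter – and not GhostFrame’s actual timing, which runs far faster.

```python
def wall_content(subframe: int, subframes_per_frame: int = 4,
                 exposed_subframes: int = 2) -> str:
    """Decide what the wall displays in a given wall refresh
    ("subframe"). The taking camera's electronic shutter is open only
    for the first `exposed_subframes` subframes of each frame (a
    180-degree shutter here), so tracking patterns shown in the
    remaining subframes never reach the recorded image -- only the
    tracking camera, sampling those slots, sees them."""
    if subframe % subframes_per_frame < exposed_subframes:
        return "scene"
    return "tracking pattern"

# One full camera frame's worth of wall refreshes:
print([wall_content(s) for s in range(4)])
# ['scene', 'scene', 'tracking pattern', 'tracking pattern']
```

Multi-camera setups extend the same scheme: each camera’s exposure is assigned its own slots, so each records only the background rendered for it.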
TrackMen tracking camera on CVP's LED volume demo.
CVP wasn’t the only company showing an LED wall: VFX World and Mo-Sys teamed up to show a high-resolution, 2.6mm-pitch wall with Mo-Sys’s StarTracker system, which relies on a pseudorandom scattering of retro-reflective dots above the scene and an optical tracking camera that looks straight up. The joint display was careful to cover not only the most visible parts of the setup – the panels, camera tracking and rendering gear – but also hardware from companies like Brompton, whose processors drive the panels and are crucial to the whole thing working as it should.
These are very complete, very up-to-the-second LED volume setups. It is certainly possible to use LED video walls as simple backdrops without all that sophistication – notice that the integration with the foreground scenery almost works even in these casual snapshots – and people have done good work using LEDs to play back simple video footage. The days of LED walls automatically looking cyan without serious corrective work are over.
We’ve already talked about Quasar’s pixel-capable tubes; Sumolight’s SumoSky is a rather larger hybrid solution designed to create broad-area effects based (at least potentially) on video data. It’s built from a comparatively sparse framework of LED battens, each with a single row of programmable emitters designed to backlight a large diffusing textile positioned a few inches away. The result is necessarily low-resolution, but it’s an effective way of creating not only a convincing sky backdrop but also interactive lighting.
A sneak peek at the LED battens behind Sumolight's SumoSky illuminated backdrop. The textile is about six inches from the lights.
2022’s BSC show certainly wasn’t dominated by LEDs driven by video data, but it’s undeniably the thing of the moment. Of course, stereo 3D was also once the thing of the moment, though that was always slightly controversial, with people staggering cross-eyed out of demonstration screenings and trying to convince themselves all last year’s problems had been solved, using exactly the same excuses they’d used the previous year. Few people have any serious objection to the effectiveness of this sort of LED-based virtual production; even the actors like it better than trying to react to a green screen.
Probably the best thing about it is that the traditional techniques of lenses and lights – things the BSC itself was built to conserve, teach and celebrate – work really nicely when the virtual backdrop is actually there on the day, and it’s noticeable that even the greyest beards in the venue were queueing up to ask questions with everyone else. Nobody wants to be the compositor on a flashlights-in-the-lens sequence shot against greenscreen on anamorphic lenses, after all. Usually we’d have to skip the flashlights or the anamorphics; now we can skip the green, which is a solution that seems to satisfy more or less everyone.