02 Oct 2018

Lighting flicker is turning into a big problem


If you've watched an episode of The Grand Tour and noticed the cars' LED headlights flickering during the slow-motion shots, you've just witnessed an ever-increasing problem with modern lights: flicker. And it's not an issue that traditional methods can solve.

Years ago, there was really only one flicker problem with lighting. Some lights would flicker at the local mains frequency, which would always be either 50 or 60 cycles per second, and there were ready references available to solve the resulting flicker problems. Many people still have the numbers in the backs of their minds: shoot 24 frames per second with a shutter angle of 172.8 degrees and the resulting exposure is 1/50 of a second, so 50Hz lighting will always include one complete cycle of bright and dark periods in every frame. For 24-frame pictures in 60Hz regions, select a shutter speed of 1/60 or 1/120 of a second, or an equivalent shutter angle on film, so again, a complete mains cycle is exposed in every frame.
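The arithmetic above is easy to sanity-check. Here's a minimal Python sketch (the function names are my own, not from any camera software) that converts a shutter angle to an exposure time and tests whether that exposure spans a whole number of mains cycles:

```python
def exposure_seconds(fps: float, shutter_angle_deg: float) -> float:
    """Exposure time for a rotary shutter: (angle/360) of the frame period."""
    return (shutter_angle_deg / 360.0) / fps

def whole_mains_cycles(exposure_s: float, mains_hz: float,
                       tol: float = 1e-6) -> bool:
    """True if the exposure covers a whole number of mains cycles."""
    cycles = exposure_s * mains_hz
    return abs(cycles - round(cycles)) < tol

# The 50Hz case from the text: 24fps at 172.8 degrees gives a 1/50s exposure.
e = exposure_seconds(24, 172.8)
print(e)                            # 0.02, i.e. 1/50 of a second
print(whole_mains_cycles(e, 50))    # True: one full 50Hz cycle per frame
print(whole_mains_cycles(e, 60))    # False: 1.2 cycles, so it would flicker

# In a 60Hz region, a 144-degree shutter at 24fps gives a 1/60s exposure.
print(whole_mains_cycles(exposure_seconds(24, 144), 60))  # True
```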

The old methods don't work any more

Unfortunately, this doesn't work anymore. The old rules relied on the fact that most lighting technologies used transformers. Transformers work by feeding the mains power into a coil of wire wrapped around a chunk of iron. As everyone who's done a high school science class knows, that turns the iron into a magnet. Wrap another coil of wire around the whole thing and the magnetism induces an electrical current in it. Vary the number of turns in each coil, and we can vary the way power flows into and out of the transformer. That's how old-school magnetic HMI and fluorescent ballasts work, and it's fine – except that when 50Hz power goes in, 50Hz power comes out, and the light flickers at that frequency.
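The turns-ratio relationship is simple enough to put in code. A rough sketch, assuming an ideal, lossless transformer – the figures are illustrative, not from any real ballast:

```python
def secondary_voltage(primary_v: float, primary_turns: int,
                      secondary_turns: int) -> float:
    """Ideal transformer: output voltage scales with the turns ratio."""
    return primary_v * secondary_turns / primary_turns

# Stepping 230V mains down to roughly 12V needs about a 19:1 turns ratio -
# and the output still cycles at exactly the mains frequency.
print(secondary_voltage(230, 1900, 100))  # ~12.1V
```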

It's also not very space-efficient. The amount of power that can flow through a transformer per mains cycle is pretty much fixed – the iron core saturates above a certain amount of magnetism. One solution is a bigger transformer. A better solution is to use more cycles per second, to get more power through the same transformer. That's the approach used by modern power supplies in everything from cell phone chargers to LED power supplies: take the mains power, smooth it out so it doesn't cycle anymore, and then chop it up into lots and lots of cycles per second. A conventional mains transformer operates at a few tens of cycles per second; a more modern design may operate at hundreds of thousands.
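To see why faster switching shrinks the transformer, treat the saturation limit as a fixed amount of energy per cycle; throughput then scales with cycles per second. A toy calculation, with made-up figures rather than anything from a real datasheet:

```python
def max_power_watts(energy_per_cycle_joules: float,
                    switching_hz: float) -> float:
    """Throughput of a saturation-limited transformer:
    (energy per cycle) x (cycles per second)."""
    return energy_per_cycle_joules * switching_hz

core_limit = 0.001  # 1 millijoule per cycle - an assumed core, for scale
print(max_power_watts(core_limit, 50))       # 0.05W at mains frequency
print(max_power_watts(core_limit, 100_000))  # 100W from the same core at 100kHz
```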

This means we can get much, much more power through a small transformer. The problem, of course, is that the lighting isn't flickering at the mains frequency anymore. What frequency is it flickering at? Whatever the designer of the electronics felt was a good idea. Generally, higher is better, since that makes the transformer smaller, and bigger transformers contain more copper, which is expensive and heavy. Everyone wants things to be cheap, small and light. Often this works out well for cinematographers, since higher frequencies generally mean less flicker, but it gets complicated fast, and without measuring, we can't really know what the flicker rate actually is.
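Why do higher frequencies generally mean less flicker? The more flicker cycles that fit inside one exposure, the less the leftover partial cycle matters. A toy simulation, assuming a worst-case fully-modulated square-wave source (well-smoothed fixtures behave far better):

```python
def exposure_ripple(flicker_hz: float, exposure_s: float,
                    phases: int = 200, samples: int = 2000) -> float:
    """Worst-case relative frame-to-frame variation in captured light for a
    50% duty square-wave source, checked across many shutter phases."""
    period = 1.0 / flicker_hz
    totals = []
    for p in range(phases):
        start = p / phases * period  # shutter opens at this phase of the cycle
        on = 0
        for s in range(samples):
            t = start + exposure_s * (s + 0.5) / samples
            on += 1 if (t * flicker_hz) % 1.0 < 0.5 else 0
        totals.append(on / samples)
    mean = sum(totals) / len(totals)
    return (max(totals) - min(totals)) / mean

# A 1/50s exposure under 150Hz flicker: exactly 3 cycles, near-zero ripple.
print(exposure_ripple(150, 1 / 50))  # close to 0
# The same exposure under 90Hz flicker: 1.8 cycles, heavy ripple.
print(exposure_ripple(90, 1 / 50))   # large - visibly flickering frames
```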

No solution

There is no universal solution to this. There are as many different frequencies in use as there are power supply control microchips and designers who use them. Well-engineered devices, including LED lights designed for film and TV work, smooth out the output from their power supplies so it isn't a problem. Otherwise, there's little reason to bother – it doesn't worry the human eye if, say, an emergency exit sign is flickering 150 times a second. But almost anything that's been designed to emit light in the last decade has at least some chance of creating this problem. There's no way to detect it without special tools, and absolutely no way to fix it without replacing the light source.

It's becoming increasingly necessary to take the actual production camera on location scouts, configured as it'll eventually be used. Shooting some test footage on a cell phone or another camera might show you something. However, with almost all cameras suffering from rolling shutter, and cell phones often suffering from very inaccurate frame rate timing, problems might be exacerbated or obscured, so it's a much better idea to take the real camera. Some of the more subtle problems can be hard to see, so shoot a minute or so of a grey card, then punch up the contrast and skip through it on an NLE timeline, looking for dark and bright bars or overall changes in exposure.
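If you can estimate a suspect light's flicker rate and your camera's rolling-shutter readout time – both assumptions here, since neither figure is usually published – a back-of-the-envelope check predicts roughly how many of those bars you'd see per frame:

```python
def expected_band_count(flicker_hz: float, readout_s: float) -> float:
    """Rough number of dark/bright bar pairs a rolling shutter paints per
    frame: one pair for each flicker cycle elapsing during sensor readout."""
    return flicker_hz * readout_s

# e.g. a light flickering 400 times a second, sensor read out over 10ms:
print(expected_band_count(400, 0.010))  # 4.0 - four bar pairs down the frame
```

Zero or a fractional value well below one suggests any flicker would show up as whole-frame exposure pulsing rather than bars – which is why skipping through the grey card footage matters too.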

All we can do is test. Good thing cameras have never been cheaper, eh?

Title image: Shutterstock - Ed Ward


Phil Rhodes

Phil Rhodes is a Cinematographer, Technologist, Writer and above all Communicator. Never afraid to speak his mind, and always worth listening to, he's a frequent contributor to RedShark.
