This is the first of three articles, running on consecutive days, that explores the reasons for using 24 fps. As our ability to capture and display framerates much higher than this has blossomed, so has the debate about how many frames we should show per second. On the face of it, higher framerates should look more real - but in practice, instead of the expected realism, we see cheap-looking TV. 48 fps makes everything look like a soap opera, apparently.
Counterintuitive. Puzzling. What's going on here? Phil Rhodes explores.
One more thing before we start: this is a fascinating debate. We are still working on RedShark's comments system, but, meanwhile, if you'd like to respond to these articles, please email firstname.lastname@example.org. We'll publish the best comments.
Anyone who attended one of several industry trade shows this year might have seen James Cameron's demonstration of high-framerate filmmaking. Given his stated intention to shoot the inevitable Avatar sequels at 60fps, and the fact that The Hobbit was shot at 48, you'd be forgiven for thinking there was significant demand for framerates above the traditional 24. As it happens, it looks like there isn't. Test audiences, including me, have found the effect too much like video, too much like a soap opera. In some ways, it's an indication of the advance of technology that exhibition framerate is now something we can practically choose to alter, even if that flexibility owes mainly to the high framerate demands of 3D projection and to cinemas' desire to offer non-feature shows - sports or performing arts - which are originated live at more conventional television framerates. This being the case, it seems like a good time to look into our old friend 24p and consider where it came from and why people like it so much.
Whatever framerate was in use, the need to standardise it didn't even arise until the dawn of synchronised sound. While modern audiences immediately recognise the hurried walks and frantic gesticulating of silent-movie characters shot at around 16fps after the turn of the twentieth century, cinemagoers of the late 1920s could expect to see material shot and exhibited at anywhere between about 15 and 30 frames per second. Once the earliest sound systems began synchronising a record player to the projector mechanism, it became clear that projection speed had to remain constant, since variable-speed sound is far less acceptable than variable-speed picture.
Even so, it was not until the advent of optical sound, with picture and audio combined on the same physical medium, that the familiar 24fps rate became standard. It was chosen - presumably as a compromise between producers, who inevitably wanted to go slower, and engineers, who wanted to go faster - simply because 24 frames per second of 35mm film equates to ninety feet per minute, the slowest speed that allowed reasonable audio reproduction with the techniques of the time. It's worth pointing out that modern 35mm analogue optical sound uses a variable-area soundtrack with much higher performance than the original variable-density one, and I hate to think what sort of mess we'd have ended up with had the producers of the time foreseen the opportunity to go even slower and save more film stock.
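The arithmetic behind that ninety-feet-per-minute figure is worth seeing: standard 4-perf 35mm film carries 16 frames per foot, so the linear speed follows directly from the frame rate. A minimal sketch (the frame rates other than 24 are purely illustrative):

```python
# Linear transport speed of 35mm film at a given frame rate.
# Standard 4-perf 35mm film carries 16 frames per foot of stock.
FRAMES_PER_FOOT = 16

def feet_per_minute(fps: float) -> float:
    """Film speed in feet per minute for a given frame rate."""
    return fps * 60 / FRAMES_PER_FOOT

print(feet_per_minute(24))  # 90.0 ft/min - the standard sound speed
print(feet_per_minute(16))  # 60.0 ft/min - a typical silent-era rate
```

The same relationship explains the producers' incentive: every frame per second shaved off the rate saves 3.75 feet of stock per minute of running time.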
And it is slow. At 24 updates per second, a moving picture is not refreshed nearly quickly enough to completely fool the human eye into perceiving motion, particularly when fast, chaotic movement is involved. Pans and tilts can cause appalling judder, and the success of 24p imaging in any context depends heavily on motion blur and the persistence of human vision to produce a result that's in any way watchable. In the context of this imperfect system, it's probably reassuring that 24fps wasn't actually chosen for any reason to do with picture performance whatsoever - it was chosen with an eye on sound performance.
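A rough way to see why pans judder at 24fps is to work out how far the image jumps between consecutive frames. The numbers below - a pan sweeping a 2048-pixel frame width in five seconds - are assumptions chosen purely for illustration, not a cinematography rule:

```python
# Per-frame image displacement during a pan, at various frame rates.
# Assumed scenario: the camera pans across the full 2048-pixel frame
# width in 5 seconds. Both figures are illustrative assumptions.
FRAME_WIDTH_PX = 2048
PAN_DURATION_S = 5.0

def step_per_frame(fps: float) -> float:
    """Pixels the image shifts between one frame and the next."""
    return FRAME_WIDTH_PX / (PAN_DURATION_S * fps)

for fps in (24, 48, 60):
    print(fps, round(step_per_frame(fps), 1))
# 24 fps -> ~17.1 px per frame; 60 fps -> ~6.8 px per frame
```

The bigger the per-frame jump, the more the eye sees discrete steps rather than smooth motion - which is exactly the gap that motion blur has to paper over at 24fps.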