31 Dec 2017

What is the "cinema feel"?



RedShark Christmas Replay: There's a quality to the cinema experience, often called the 'cinema feel', that people discuss in relation to depth of field, grain, contrast and other image characteristics, all of which have changed significantly with digitalisation. But why? Guest Author John Clark delves into frame rates and the science of perception.

As well as all the characteristics above, there is another contributory factor where digitalisation means fundamental change, and that is frame rate, or rather the lack of change in it. Unlike the director or editor, who are eager, or exhausted by their attention to detail in post production, the typical cinema-goer is there to sit back and enjoy a high-end production, knowing they are neither threatened nor involved. Relaxation is important when enjoying movies: the thrills, the visuals, or giving thought to the implications of character and plot. No-one wants to experience the real terror of being shot at, or of getting caught in a car crash, though the inverse problem might explain why so many erotic sequences fail to hit the spot.

Digital projection is sometimes seen as being 'too smooth', to quote a British producer who considered introducing grain into the final cut of a high-end cinema film, while Peter Jackson's high-frame-rate shoot at 48fps for 'The Hobbit' brought criticism for yielding video-like images.

A champion of 'immersive cinema', cinematographer Douglas Trumbull ( douglastrumbull.com ) set up some large-scale tests when developing 'Showscan' in the 1980s and 90s to explore the options for film production and projection at different frame rates. He has expressed the interesting opinion that movies might be created with variable frame rates, matching the mood of the film to the experience of the audience, who are subliminally sensitive to frame rate. The simple assumption that more is better may not turn out to be the case.

We still tend to assume that the camera somehow mimics the eye and brain, while a projector is basically a camera in reverse, which was indeed sometimes the case in early cinema and for amateur film-makers who could adapt their cameras to project their film after processing. However, for Digital Cinema the Digital Light Projector ticks along at 144fps, whatever the official frame rate of the movie, with individual frames described as being 'flashed' more than once to fill the available time (though the very concepts of 'frame rate' and 'flashing' shouldn't really be applied to 3-chip DLPs, whose micro-mirrors oscillate 50,000 times a second, with information updated in a modulated form to change each mirror's pitch). This remarkable technical achievement is quite different to the traditional 'maltese cross' light/dark approach that matches the stop/start motion of a print as it moves through the projector, or the scanned lines of a monitor. The notion of 'persistence of vision' building a perceptual bridge between one frame, or half-frame, and the next need no longer apply.
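As a back-of-envelope illustration (my own sketch, not how real projector firmware works), the repeat count for each content frame on a light engine updating 144 times a second is simply 144 divided by the frame rate:

```python
# Illustrative arithmetic only: a digital cinema light engine updating
# 144 times per second repeats ("flashes") each content frame 144/fps times.
LIGHT_ENGINE_HZ = 144

def flashes_per_frame(fps: int) -> int:
    """How many times each frame is shown, for rates that divide 144."""
    if LIGHT_ENGINE_HZ % fps != 0:
        raise ValueError(f"{fps} fps does not divide {LIGHT_ENGINE_HZ} evenly")
    return LIGHT_ENGINE_HZ // fps

for fps in (24, 48, 72):
    print(f"{fps} fps -> each frame flashed {flashes_per_frame(fps)} times")
```

So a 24fps feature is flashed six times per frame, 48fps three times and 72fps twice, while 96fps would fail the divisibility check (144/96 = 1.5).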

When the Digital Cinema Project was defined, an obvious priority was to ensure that the new system let the massive libraries of movies shot at 24fps still be seen, while using JPEG2000 for DCP files creates high-quality archive material for the future. A format war was in no-one's interest when billions were needed to re-equip cinemas around the world, but equipment manufacturers like Christie are adamant that higher frame rates are on their way, which implies yet another wave of re-equipping for production and post.

What frame rates and why?

The potential multiples for higher frame rates are fairly obvious: 144fps itself, or 2x72, 3x48 and 6x24 for 3-chip projectors. 96 per second falls away as an option, since 144/96 is 1.5 and each frame cannot be flashed a whole number of times within a single second, unless the images are packaged over a fraction of 2 seconds rather than 1, ie 288 slots at 3x96 (and why not?). The situation is complicated by single-chip projectors, which use a colour wheel to project R, G and B successively, limiting them to 24 or 48 (and perhaps 96). Adaptations now mean material shot for 25/50fps and 30/60fps can be projected, but with over 100,000 projectors now in use, there is little likelihood that this basic 144fps set-up will change for many years.
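For awkward rates that don't divide 144 evenly, the slots in a second can be spread across the frames as evenly as possible, in the spirit of classic 3:2 pulldown. This is purely my own illustration of the arithmetic, not a mechanism from the Digital Cinema specifications:

```python
# Sketch: distribute a fixed number of flash slots per second across
# fps frames as evenly as possible (a Bresenham-style cadence).
def flash_cadence(fps: int, engine_hz: int = 144) -> list:
    """Slots allotted to each of the fps frames in one second."""
    return [(i + 1) * engine_hz // fps - i * engine_hz // fps
            for i in range(fps)]

print(set(flash_cadence(24)))  # every frame flashed six times
print(set(flash_cadence(96)))  # frames alternate between one and two flashes
print(sum(flash_cadence(96)))  # the 144-slot second is always filled exactly
```

For 96fps the cadence alternates single and double flashes, which is one way of reading the two-second, 288-slot packaging mooted above.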

Anyone seeking to work with variable frame rates à la Trumbull could construct their digital prints to reflect the 144fps standard, varying the number of images by including single frames or doubling up as appropriate, frame by frame, rather than double- or triple-'flashing' the same frame. The 3-chip projector technology has been limited by the bandwidth available for data transfer, rather than any inherent need for frames to be repeated. With 4 million micro-mirrors co-ordinated on a 4K chip this is no small challenge; however, high-capacity cabling is becoming the norm, so that limitation should fall away.
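A Trumbull-style variable-rate 'print' could then be modelled as a list of frames, each declaring how many 1/144-second slots it holds. The names and structure below are hypothetical, just to make the idea concrete:

```python
# Hypothetical sketch: each source frame holds the screen for a chosen
# number of 1/144 s slots, so the effective frame rate can vary shot by shot.
def expand_to_slots(frames):
    """frames: list of (frame_id, slots) pairs. Returns one entry per slot."""
    schedule = []
    for frame_id, slots in frames:
        if slots < 1:
            raise ValueError("every frame needs at least one slot")
        schedule.extend([frame_id] * slots)
    return schedule

# A calm shot holds each frame for 6 slots (a 24fps feel); an action
# beat holds each for 2 slots (a 72fps feel):
clip = [("calm_1", 6), ("calm_2", 6), ("fast_1", 2), ("fast_2", 2), ("fast_3", 2)]
schedule = expand_to_slots(clip)
print(len(schedule), "slots =", len(schedule) / 144, "seconds of screen time")
```

The point of the sketch is that nothing in a slot-based delivery forces every frame to occupy the same number of slots; only bandwidth and tooling do.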

The original standardisation of film around 24fps was driven by the introduction of optical sound for projection at acceptable quality, and the use of double and quadruple flash in movie projectors has long meant that they are geared to 48 or 96 flashes per second of 24 frames. TV and video were pragmatically synched to the prevailing local AC electricity supplies, and there wasn't much change until the arrival of progressive video on computers, using lower or higher rates as required. These standards were all adopted for good engineering reasons and economy, but only limited attention was given to aesthetics or perception. One way or another, they've worked pretty well and we're attuned to their feel.

It is easy to presume that higher frame rates bring an automatic improvement in quality by overcoming some traditional irritations like flicker, but they make different demands on our perception. The symmetry between camera and projector has been broken. Any change will probably come from the camera side and post-production, which brings us back to Trumbull's notion of a relationship between frame rate and audience response.

More than a feeling

To get a little closer to the question of 'cinema feel', it's worth reminding ourselves that our brain is not just concerned with seeing, hearing and interpreting a movie. All kinds of basic bodily activities are co-ordinated at the same time, and alongside all the nerve signals and specialist cell responses, the brain hosts a mass of chemical operations, with fountains of chemicals flooding through it at various rates according to the demands being made on it. With nerves passing information throughout the body at a rate of between 1 and 1000 times per second, there is a massive task of synchronisation, though we lack a single organ which functions like a clock.

The visual pathway from the retina through the brain to the primary visual cortex was described by David Hubel, work which won him a Nobel Prize in 1981, and his milestone book, 'Eye, Brain, and Vision', is available chapter by chapter on the Harvard Medical School website. I checked with someone who ought to know, experimental psychologist Niko Busch, a professor in Berlin's 'Mind and Brain' research group, who confirmed that Hubel's account stands, even though it represents research dating from the 1950s onwards.

Busch ( http://oszillab.net ) has been involved with various studies of flicker and its influence on the brain, using standard psychological testing methods and electroencephalography (EEG) to accurately assess the electrical impulses arising in the brain due to flicker and other short-term events. One outcome was that the brain continues to respond to flicker at frequencies higher than those at which observers have ceased to be aware of it. People's awareness of flicker varies: in his test group only about 15% noticed flicker at rates above 50Hz, and none above 55Hz, while everyone's brain registered a response up to 100Hz, with an outlying response at about 160Hz.

Equally surprisingly, people also judge duration differently: under flicker there is a tendency to overestimate the passage of time compared to a situation without flicker. In another experiment, sudden 'saccadic' eye movements were stimulated by distracting the viewer, and this impaired their ability to detect changes in images shown before and after the event, say an object within a photograph where a bit of photoshopping had changed its colour or position, or removed it completely, though the observers weren't necessarily aware of any eye movement. This might be good news for props and continuity people, whose errors might be overlooked, but it isn't so encouraging if a subtle change is crucial to the plot.

While it would be misleading to imagine that the brain sees images in discrete frames, or has a particular frame rate, Busch mentioned that the rate of nerve impulses slows in different regions of the brain and that in the areas dealing with higher level functions this is only around 10 per second, which I found surprisingly low. Electrical activity in the brain is much more complicated than the intuitive assumption that low levels of activity are associated with sleep and frequencies simply increase with our level of awareness and attention. There are a number of other factors that might help us appreciate the complexity of visual perception.
