11 Feb 2018

All about sensors, lenses and depth of field



Replay: You can't go far without understanding the relationship between sensors, lenses and depth of field, though it's slightly more complicated than you might think. Phil Rhodes makes it understandable.

Part two of this series on sensors and lenses is available here

Since the earliest camera obscura, humans have used technologies which project images of a real-world scene onto a screen. Only quite recently have we gained the technology to automatically record that image, although we should probably take a moment to think about the potential use of camerae obscurae by old masters such as Johannes Vermeer. The documentary feature Tim's Vermeer details attempts by NewTek founder Tim Jenison (yes, he of TriCaster and LightWave fame) to work out how this might have been done, including an ingenious approach to colour matching which could almost be thought of as assisted manual photography. It's well worth a watch, although if some of the suppositions around the subject are correct, the effective sensor size applicable to Vermeer's paintings – in terms of a modern camera – would have been the same size as the finished canvas, over 15 by 17 inches. That's positively gigantic by any standards, and anyone with any knowledge of the attendant issues will already be frowning about light levels, depth of field and other parameters.

Back in the world of modern cameras, we currently enjoy (or perhaps we suffer) a huge number of sensor size options. The physics, though, is exactly the same, whether we're talking about Vermeer's canvases or the sub-fingernail-sized slivers of silicon in the average modern webcam.
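To put that canvas-sized "sensor" in perspective, a quick sketch: the crop factor of any format relative to full frame (36 x 24 mm) is simply the ratio of the two sensor diagonals. The 15 by 17 inch figure comes from the article's supposition about Vermeer; everything else here is standard arithmetic, not a claim about any real camera.

```python
import math

MM_PER_INCH = 25.4

def diagonal_mm(width_mm, height_mm):
    """Diagonal of a rectangular imaging area in millimetres."""
    return math.hypot(width_mm, height_mm)

full_frame = diagonal_mm(36.0, 24.0)                      # ~43.3 mm
canvas = diagonal_mm(15 * MM_PER_INCH, 17 * MM_PER_INCH)  # ~576 mm

# Crop factor relative to full frame; a value below 1.0 means a
# format *larger* than full frame, with correspondingly shallower
# depth of field for an equivalent field of view and f-stop.
crop_factor = full_frame / canvas
print(f"Canvas diagonal: {canvas:.0f} mm, crop factor: {crop_factor:.3f}")
```

A crop factor of roughly 0.075 means the canvas is over thirteen times the diagonal of a full-frame sensor, which is why depth of field and light levels become such alarming concerns at that scale.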

Landing the image

Most people understand the idea of a lens projecting an image, usually circular. That part of the image which falls on the sensor becomes the picture we see. A mattebox with inserts to suit a particular aspect ratio might crop the projected image closer to the intended final frame, with the idea of limiting flares caused by extraneous light bouncing around inside the camera or lens. Overdoing this can cause problems, because a lens focusses light reflected from the subject over a range of angles, not just rays which happen to pass directly through the centre of the lens. A matte cut too closely to the shape of the frame might sit just out of shot, yet still block some of those converging rays, darkening the edges of the image and creating vignetting.

But the principal real-world concern of landing an image correctly on a sensor is one of flange focal distance, the distance between the lens mount's flange and the sensor. Some lenses offer adjustable back focus, but many rely solely on the mechanical alignment between the mount and the sensor. Because the distance between the sensor and the lens is very small, while the distance between the lens and the subject is very large, a large ratio is involved, and the alignment must be very precise. Many lens mounts have the option of inserting shims (very thin sheets of metal) between the mount and the camera body, to allow fine adjustments to be made. If this is not done correctly, all may initially seem well, but issues such as inaccurate focus distance marks and an inability to reach infinity focus may occur.
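The sensitivity of that ratio can be sketched with the thin-lens equation, 1/f = 1/u + 1/v, where f is focal length, u the subject distance and v the lens-to-sensor distance. The 50 mm lens, the 3 m focus mark and the 0.05 mm shim error below are all hypothetical values chosen for illustration, not a manufacturer's tolerance.

```python
# Thin-lens sketch of why a tiny flange error shifts the focus marks.
# 1/f = 1/u + 1/v: f = focal length, u = subject distance,
# v = lens-to-sensor (image) distance, all in millimetres.

def image_distance(f_mm, u_mm):
    """Image distance v at which a subject at distance u comes to focus."""
    return 1.0 / (1.0 / f_mm - 1.0 / u_mm)

def subject_distance(f_mm, v_mm):
    """Inverse: subject distance u that is sharp for image distance v."""
    return 1.0 / (1.0 / f_mm - 1.0 / v_mm)

f = 50.0            # hypothetical 50 mm lens
marked = 3000.0     # focus ring set to its 3 m mark
v = image_distance(f, marked)

shim_error = 0.05   # sensor sits 0.05 mm too far from the mount
actual = subject_distance(f, v + shim_error)

print(f"Marked 3.000 m, actually sharp at {actual / 1000:.3f} m")
```

A twentieth of a millimetre at the mount pulls the plane of sharp focus from 3 m to roughly 2.84 m, which is exactly the kind of error that makes distance marks lie while everything still looks superficially fine.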


Phil Rhodes

Phil Rhodes is a Cinematographer, Technologist, Writer and above all Communicator. Never afraid to speak his mind, and always worth listening to, he's a frequent contributor to RedShark.
