13 Mar 2015

How to understand video scaling and framerate conversion - part two

  • Written by Phil Rhodes
Video Processing: Part 2 | Blackmagic Design / RedShark News


In Part Two of our four-part series on video processing and standards conversion, Phil Rhodes gives us a lesson on video scaling and explains why a hardware solution is the way to go.

Bigger, as the saying goes, is better. Notwithstanding the most traditional interpretation of the phrase, this is certainly borne out in the experience of cinema-goers, who have traditionally been more impressed with greater screen size, a larger bucket of popcorn and anything up to an imperial gallon of carbonated sugar water that turns your tongue blue. Making pictures bigger, however, involves more technology. Perhaps surprisingly, so does making them smaller, as was discovered by people who noticed the aliasing problems of Canon's famous 5D Mk. II camera, whose sensor resolution was so much higher than that of the output image that adequately scaling it down became a serious technical issue.

The 'what' and 'why' of aliasing

Aliasing, or the blockiness of digital images, is a characteristic that more or less everyone knows. It's the effect created by any attempt to fit non-rectilinear objects onto an imaging surface that's made up of pixels. This affects both displays and cameras, and whereas displays invariably have pixels in a regular grid, cameras such as the 5D Mk. II cluster the pixels of the sensor together in irregular shapes when in video mode, creating unusual artefacts.

At this point, it's worth understanding some of the formal information theory behind aliasing. Aliasing occurs when a digital sampling device attempts to sample a signal containing information at a frequency higher than its Nyquist frequency - that is, higher than half the sampling rate. Practically, in terms of imaging, this can happen when a camera's lens projects a high-resolution image onto its sensor, or in a scaler attempting to reduce the pixel dimensions of an existing digital image.
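
To make the Nyquist limit concrete, here's a minimal Python sketch; the sample rate and frequencies are arbitrary illustrative values, not anything taken from a real camera. A 70Hz cosine sampled at 100Hz yields exactly the same samples as a 30Hz cosine, its alias - the frequency folds back around the 50Hz Nyquist limit.

    import numpy as np

    # Nyquist: a sampler running at fs can only represent frequencies below fs / 2.
    fs = 100.0                       # samples per second (illustrative value)
    t = np.arange(0.0, 1.0, 1.0 / fs)

    # A 70 Hz cosine is above the 50 Hz Nyquist frequency of a 100 Hz sampler...
    too_fast = np.cos(2 * np.pi * 70.0 * t)

    # ...so its samples are identical to those of a 30 Hz cosine:
    # 70 Hz folds back around the 50 Hz limit to 100 - 70 = 30 Hz.
    alias = np.cos(2 * np.pi * 30.0 * t)

    print(np.allclose(too_fast, alias))   # True - the two are indistinguishable

Once sampled, nothing in the data says which of the two frequencies was really present; that is the sense in which aliasing cannot be undone after the fact.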

Stairway to...

The test image we're using here is called a zone plate and is designed to reveal aliasing clearly; on a straight line, the same effect is visible as stair-stepping. A zone plate reveals aliasing as sets of additional circles that aren't concentric with the original circles. In this example, we can see sets of curves appearing at the edges of the zone plate square when we omit every other pixel. In order for this to look right, you'll need to ensure that your web browser and operating system are set up so that they don't scale images; look for a "zoom" option and set it to 100%. If the square on the left doesn't look like clean concentric circles, something is wrong.

[Figure 1: Aliasing]
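
If you'd like to reproduce the effect yourself, here's a rough numpy sketch - not the actual test pattern used for the figures above - that synthesises a zone plate and then downscales it the naive way, by simply discarding every other pixel:

    import numpy as np

    def zone_plate(size=512, k=0.005):
        # Concentric rings whose spatial frequency rises towards the edges;
        # with k = 0.005 the finest rings stay just below the Nyquist limit
        # of the full-resolution image.
        y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
        r2 = x.astype(float) ** 2 + y.astype(float) ** 2
        return 0.5 + 0.5 * np.cos(k * r2)

    plate = zone_plate()

    # Naive downscale: simply keep every other pixel in each direction.
    # The finest rings now exceed the new, halved Nyquist limit and fold back
    # as the spurious, non-concentric circles seen in Figure 1.
    naive_half = plate[::2, ::2]
    print(plate.shape, naive_half.shape)   # (512, 512) (256, 256)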

The problem can be solved by removing the high-frequency information from the signal - that is, blurring the image - before we downscale. Notice that the half-resolution version on the far right barely looks different from the version in the middle, even though it has only a quarter as many pixels.

[Figure 1a: Filtration, then aliasing]
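
And here's a sketch of the fix, under the same assumptions as the previous example. The Gaussian blur and its sigma value are arbitrary stand-ins for the properly designed low-pass filter a real scaler would use, but the principle - blur first, then discard pixels - is what the comparison above illustrates:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Same zone plate as in the previous sketch.
    y, x = np.mgrid[-256:256, -256:256]
    plate = 0.5 + 0.5 * np.cos(0.005 * (x.astype(float) ** 2 + y.astype(float) ** 2))

    # Blur first, then keep every other pixel. Removing detail above the new
    # Nyquist limit before decimating prevents it folding back as false circles.
    blurred = gaussian_filter(plate, sigma=1.0)
    filtered_half = blurred[::2, ::2]
    print(filtered_half.shape)   # (256, 256) - a quarter of the pixels, no extra circles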

Once an image contains aliasing, it cannot be removed; there is no way to tell the difference between high-frequency (that is, sharp) true picture detail and the artefacts of aliasing. Sadly, the filtration itself is an imperfect process: camera designers must trade off aliasing suppression against sharpness. After-market optical low-pass filters for Canon DSLRs can remove aliasing in video mode, but the result is often noticeably softer.





Phil Rhodes

Phil Rhodes is a Cinematographer, Technologist, Writer and above all Communicator. Never afraid to speak his mind, and always worth listening to, he's a frequent contributor to RedShark.
