Ever wondered how digital video effects work? No one said it was simple, but here's an easy-to-understand introduction to digital video effects processing.
Video FX, or VFX. We've all seen 'em: from the weatherman in front of a map to the motion-tracked, multi-layer composite of live action, virtual backgrounds and CG animation. These two examples, and everything in between, have one thing in common. Digital video is simply a regular grid (a “matrix”) of numbers representing varying shades of light and dark, and software has the seemingly miraculous ability to manipulate those digits.
In this article we'll be looking at how software uses those numbers to create Video FX.
It's all in the matrix
(When I refer to VFX software, I mean either a stand-alone VFX program, such as Adobe's After Effects, or the effects filters and plugins available in most NLEs, such as Lightworks.)
Although there are various ways these number matrices can be laid out in a video stream, for our examples we’ll use the one most VFX software uses: RGB(A), the A standing for alpha channel. This channel plays an important role in VFX, which we'll get to in a moment. First, let's look at the R, G and B channels, which stand for Red, Green and Blue. Mixing values of each of these primary colours can yield millions of different colours, and those values for each colour are recorded in the video stream. For 8-bit colour, which is the norm for consumer and low- to mid-level professional video cameras and for most final delivery formats, there are 256 levels of light and dark for each colour channel. There is also higher bit-depth video (10-bit, 12-bit and 16-bit), which can yield even greater image and colour fidelity, since it allows a greater number of different shades of light and dark. Something many VFX programs do before manipulating the numbers is convert those integer values into floating-point numbers ranging from 0.0 to 1.0, where 0.0 is black and 1.0 is white. So a mid-level value in 8-bit video, such as 128, would be converted to roughly 0.5 (128/255 ≈ 0.502). This is what Lightworks does. Now it's just a matter of how the VFX software uses these numbers and channels to create an effect.
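As a quick illustration (in Python, purely for demonstration, not anything Lightworks actually runs), that integer-to-float conversion is just a division by the channel's maximum value:

```python
# Convert an 8-bit integer channel value to a float, as described above:
# divide by the maximum value, which is 255 for 8-bit video.
def to_float(value_8bit):
    """Map an 8-bit channel value (0-255) to the range 0.0-1.0."""
    return value_8bit / 255.0

print(to_float(0))    # black  -> 0.0
print(to_float(255))  # white  -> 1.0
print(to_float(128))  # mid grey -> roughly 0.502
```

Higher bit depths work the same way, just with a larger maximum (1023 for 10-bit, and so on).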
Alpha and Transparency
I don't believe there's any video camera that records anything usable in an alpha channel, if it records one at all, but 3D computer animation programs can. Let’s say you have a 3D logo that you’d like superimposed over another video. The 3D program can record a value of 1.0 (or 255 in 8-bit video) into the alpha channel of each pixel where the logo is, and a value of 0.0 where the background is. Now, bringing both into your VFX program and adding them to your timeline with the logo above your video, you can tell the program to use the alpha channel in the logo video to "composite" the two videos together. Wherever there's a 1.0 in the logo video's alpha channel, the background video will be blocked and only the logo will be displayed. Wherever there's a 0.0, the software will block the logo video and allow the background video to be displayed instead. If you want the logo to be semi-transparent, your 3D program could record a value of 0.5 into the alpha channel where the logo is, keeping 0.0 for the background. Where the logo is, the VFX program would then equally mix, or average, the values of the two video clips. Of course, there are usually controls in the VFX program to adjust the values for varying levels of transparency.
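To make that mixing concrete, here's a minimal Python sketch of per-pixel alpha compositing. The pixel values are made up for illustration; real VFX software does exactly this kind of arithmetic, just across millions of pixels per frame:

```python
def composite(fg, bg, alpha):
    """Blend one RGB pixel over another using the foreground's alpha.
    All values are floats in 0.0-1.0. alpha=1.0 shows only the
    foreground (the logo); alpha=0.0 shows only the background;
    alpha=0.5 averages the two, giving semi-transparency."""
    return tuple(alpha * f + (1.0 - alpha) * b for f, b in zip(fg, bg))

logo_pixel = (1.0, 0.2, 0.1)   # hypothetical reddish logo pixel
video_pixel = (0.0, 0.4, 0.8)  # hypothetical bluish background pixel

print(composite(logo_pixel, video_pixel, 1.0))  # logo fully opaque
print(composite(logo_pixel, video_pixel, 0.5))  # equal mix of the two
```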
Green Screen trickery
Now let's look at a green screen, for example. In your 3D animation program you could set the background around the logo to a bright green colour instead of setting values in the alpha channel. The RGB values where the green is will have higher "G" values than "R" and "B" values. When you apply a green screen effect to a background video and your logo video, your VFX software will analyse the RGB values of each pixel in the logo clip and give each pixel an alpha channel value based on how it interprets those values. Where the green value is clearly higher than both the red and blue values, the program will put 0.0 into the alpha channel, and 1.0 everywhere else. This allows the background clip to show through where the green is. Most VFX programs will let you choose the "chroma key" colour, but green or blue is usually used. The ratio of green to red and blue can also be adjusted to fine-tune the analysis.
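A very simplified Python sketch of that analysis might look like this. Real keyers are far more sophisticated (they produce soft edges, suppress green spill, and so on), and the `threshold` parameter here is purely illustrative:

```python
def green_key_alpha(r, g, b, threshold=0.1):
    """Derive an alpha value from a pixel's RGB values: where green
    clearly dominates both red and blue, the pixel is keyed out
    (alpha 0.0); everywhere else it is kept (alpha 1.0).
    threshold tunes how much green must dominate before keying."""
    if g - max(r, b) > threshold:
        return 0.0
    return 1.0

print(green_key_alpha(0.1, 0.9, 0.1))  # bright green backdrop: keyed out
print(green_key_alpha(0.8, 0.3, 0.2))  # reddish logo pixel: kept
```

The resulting alpha values then feed straight into the same compositing arithmetic described in the previous section.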
Digital Signal Processing (DSP) algorithms can also be applied to the pixel values of your video. If the software looks at a small group of adjacent or closely spaced pixels and replaces each value with an average of the group before it is displayed, it is applying a low-pass filter to the image: large differences between neighbouring pixels are smoothed out. A threshold can be added, so that groups whose values are nearly equal, or whose differences fall below the threshold, are displayed unchanged, while groups whose values vary above the threshold are averaged. This is basically how blurring is implemented; blurring is a low-pass filter. Conversely, where the pixel values in the analysed group do differ, the lower values can be lowered further, darkening those pixels, and the higher values raised, brightening those pixels, which exaggerates the difference. This is one way of sharpening, or applying a high-pass filter.
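Here's a toy Python sketch of both ideas, working on a single row of greyscale pixel values. Real filters work in two dimensions and over larger neighbourhoods, and sharpened values would normally be clamped back into the 0.0-1.0 range:

```python
def blur3(pixels, i):
    """Low-pass filter: replace pixel i with the average of itself
    and its two neighbours, smoothing out differences."""
    return (pixels[i - 1] + pixels[i] + pixels[i + 1]) / 3.0

def sharpen3(pixels, i, amount=1.0):
    """High-pass boost: push pixel i away from the local average,
    exaggerating the difference between neighbouring pixels."""
    avg = blur3(pixels, i)
    return pixels[i] + amount * (pixels[i] - avg)

row = [0.2, 0.2, 0.8, 0.8]  # a hard edge in one row of pixels
print(blur3(row, 1))     # the edge is softened toward the average
print(sharpen3(row, 2))  # the bright side of the edge gets brighter
```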
As you can see, VFX creation is simply using mathematics to analyse and change the number values that represent the different shades of light, dark and colour. The only limit to the ways these numbers can be changed is the software itself. A program like After Effects can manipulate them in hundreds of ways: from a simple green screen effect all the way to analysing the numbers in adjacent frames of video, looking for patterns that allow the program to track motion across multiple frames.
Lightworks and VFX
Lightworks also has VFX capabilities built in, ranging from blurs and sharpening to colour correction, chroma key, pans, zooms, rotation, and many more. Lightworks uses DirectX pixel shaders for GPU-accelerated effects. These pixel shaders are written in a simple programming language called HLSL (High Level Shader Language), which provides a way to create algorithms that manipulate the pixel values of your video.
In my next article we'll look at this pixel shader language and how to write your own VFX for use in Lightworks.