
Veteran RedShark reader offers sage advice on video transmission, 4K


In a recent letter to RedShark Editor, David Shapton, veteran RedShark reader Joe Wellman schools us whippersnappers on the technical aspects of video transmission, including 4K.

I have recently been thinking about the technical articles I have read in RedShark. Some of them have compelled me to comment, based on my training and collective experience after 53 years in electronics. Certain themes seem to trigger responses which, when I think about it, make me wonder why I bother. I say this because my technical mind is arcane! Why, you might ask, is this so? Well, after considerable thought, I figured it's time to explain.

You see, I lived in another time! When I started in my field of endeavor (electronics), most of what I dealt with was in analog format, and that information was transmitted as a signal. Because of this, I learned about the components which make up electronic devices, DC and AC circuits, current signal flow (which is the transmission scheme), and the technical terms which tie everything together.

The component parts are physical in nature, and while the delivery scheme in electronics has shifted from analog to digital (in some cases), everything I learned remains true. I realize that most of today's technical people were not trained the way I was!

This became apparent when, in the early 70s, I first got into video. I noticed that when I discussed transmission of video signals, some (not all) of my fellow video people had trouble understanding the relationships, and the problems that could occur, if adequate bandwidth was not available in an amplifier. They lacked the basic knowledge of transmission line electrical equivalents and termination problems. They understood the makeup of the video sync signal, the horizontal and vertical rates, and that the color subcarrier had to be present to constitute a color image. They also understood transmission delay, in that the longest run of video cable had to be matched in length by all the other cables at the switcher or router inputs, but group delay was not fully understood. They could use a vectorscope and a waveform monitor and, thanks to Tektronix, had manuals that would allow them to use a video signal generator to test video circuit devices and paths with test signals like color bars and 2T or 20T pulses, but some did not understand DC clamp circuits or sync strippers. And forget time domain and spectrum analyzers, which were unknown devices except to the chief engineer at a terrestrial broadcast facility.

After the transition to digital video, I watched the problem get worse. While there are bright and well-trained engineers and technicians out there, they became fewer in number as the US networks cut their engineering support staffs. It seems digital is 'more reliable' than analog, which is not entirely true, but close enough for the bean counters. Even an old RF guy like me had to put on his thinking cap when we started to deal with sampling scope equipment to see the high definition eye (is it clear?) at a 1.485 Gbps data rate (like dealing with an RF transmission path at 742 MHz). Things really get hairy at 2.97 Gbps, because the connectors used at the cable end to deal with the bandwidth and termination need to be smaller than a standard BNC, such as an SMA or a miniature BNC. Cable to carry the signal at 3 Gbps needs to be hard miniature coax to prevent loss, so grounds really count. Oh yes, and there is video's insistence on 75 ohm termination instead of 50 ohm, which requires adequate drivers to supply the current needed.
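For readers who want to see where those figures come from, here is a minimal back-of-envelope sketch (our arithmetic, not Joe's; the sdi_rate_gbps helper and the raster totals are illustrative assumptions based on the standard 10-bit 4:2:2 SDI rasters). The serial rate falls out of the full raster including blanking, and the fundamental frequency of the bit stream is roughly half the bit rate, which is why a 1.485 Gbps eye behaves like an RF path at about 742 MHz.

```python
# Back-of-envelope check of SDI serial rates (assumed raster figures, not from the letter).
# SDI carries 4:2:2 video as two 10-bit words per sample period (Y plus alternating Cb/Cr),
# and the serial rate counts the full raster, blanking included.

def sdi_rate_gbps(samples_per_line, lines, frame_rate, bits=10, words_per_sample=2):
    """Uncompressed 4:2:2 SDI bit rate in Gbps for a given total raster."""
    return samples_per_line * lines * words_per_sample * bits * frame_rate / 1e9

hd_30 = sdi_rate_gbps(2200, 1125, 30)   # ~1.485 Gbps: HD-SDI (1080i60 / 1080p30)
hd_60 = sdi_rate_gbps(2200, 1125, 60)   # ~2.97 Gbps:  3G-SDI (1080p60)

# An NRZ(I) bit stream has a fundamental of half the bit rate, so the eye at
# 1.485 Gbps behaves like an RF path at roughly 742 MHz.
for name, gbps in [("HD-SDI", hd_30), ("3G-SDI", hd_60)]:
    print(f"{name}: {gbps:.3f} Gbps, fundamental ~{gbps / 2 * 1000:.0f} MHz")
```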

Now bundle that with 4K video, which all of your readers are gunning for. Can you imagine a post-production facility pumping real-time video (no compression) at 6 or 12 Gbps data rates? Without stripline transmission or fiber optic switchers and routers, it's not happening! The problems are in the range of esoteric electrons and smack of quantum mechanics. Nasty stuff! Never mind the mutual interference between signals, the different grounds for digital and analog signals (yes, all the drivers are basically high frequency analog amplifiers, even if the signal is a pulse train at some serial digital rate), and all the other stuff needed, like high frequency clock drivers in processors to execute command cycles. There are not enough people trained to handle 1080p, much less anything higher.
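To put rough numbers on that, the same raster arithmetic extends to UHD, and a crude skin-effect scaling sketch (our assumption that coax loss grows roughly with the square root of frequency, with the same equaliser budget at the receiver; the relative_run_length helper is purely illustrative) suggests why usable cable runs keep shrinking as the serial rate climbs.

```python
import math

# Quad-HD raster (4400 x 2250 including blanking) gives the nominal 6G and 12G rates.
def sdi_rate_gbps(samples_per_line, lines, frame_rate, bits=10, words_per_sample=2):
    return samples_per_line * lines * words_per_sample * bits * frame_rate / 1e9

uhd_30 = sdi_rate_gbps(4400, 2250, 30)   # ~5.94 Gbps  (6G-SDI, 2160p30)
uhd_60 = sdi_rate_gbps(4400, 2250, 60)   # ~11.88 Gbps (12G-SDI, 2160p60)

def relative_run_length(bit_rate_gbps, reference_gbps=1.485):
    """Usable coax length relative to an HD-SDI run, assuming loss ~ sqrt(frequency)
    and an identical dB budget at the receiving equaliser (a deliberate simplification)."""
    return math.sqrt(reference_gbps / bit_rate_gbps)

for rate in (1.485, 2.97, uhd_30, uhd_60):
    print(f"{rate:6.2f} Gbps: ~{relative_run_length(rate):.2f}x the HD-SDI cable run")
```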

What does this all mean, you might ask? Well, specifications, and the knowledge to understand what the specs mean, would be a start. Groups like SMPTE or the EBU need to meet and make real decisions about standards, and to educate, but not confuse, the end users - not some hodgepodge of high pixel count cameras and recording equipment tied together into a monitor which converts the signal to its native format. It is kind of like going to someone's house and seeing them watching 4 x 3 480i or 480p on an LCD HD set with the aspect ratio set wrong (zoomed), while they think they are watching HD. Oh yes, and keep compression formats confined to the transmission schemes where they belong, not to production formats.

Well, that's my view, from my "senior" perspective. What do you think?
