Here's another chance to read this article about how there's more to HD than we realise. It might be that we're just not using it properly!
Is HD good enough? It might be. It certainly looked good enough eight or ten years ago, when most of us saw it for the first time. In fact it looked incredible: it was hard to imagine that one day all video would be like this. So what happened? All the talk today is about a standard that's supposedly four times better than HD (in fact 4K has four times the pixels, but only twice the linear resolution of HD).
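The arithmetic behind that parenthesis is worth spelling out. A quick sketch, using the common 1080p and UHD "4K" frame sizes (my numbers, not figures from the article):

```python
# Comparing 1080p HD with UHD "4K": four times the pixels,
# but only twice the resolution along each axis.
hd = (1920, 1080)
uhd = (3840, 2160)

hd_pixels = hd[0] * hd[1]      # 2,073,600
uhd_pixels = uhd[0] * uhd[1]   # 8,294,400

print(uhd_pixels / hd_pixels)  # 4.0 -> four times the total pixels...
print(uhd[0] / hd[0])          # 2.0 -> ...but only twice the linear resolution
```

So "four times better" is true of pixel count, while perceived sharpness scales closer to the linear figure.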
Well, nothing happened. We just got used to it, and our expectations have climbed. That's if you exclude the apparently large number of viewers still watching SD on their HD sets, blissfully unaware of the true nature of HD. These people, one imagines, will not be rushing out to buy a 4K TV any time soon.
The only sense in which HD isn't good enough is if you were to buy a TV with four times the screen area of your current one. To keep the same visual quality, you'd need 4K instead of 2K. Such comparisons rarely arise in the natural world, though, and meanwhile, rather surprisingly, most of us watch films in the cinema at a resolution that just about scrapes the bottom end of HD (unless we're in a theatre that's specifically equipped with a 4K projector showing 4K sources).
But none of the above is the real point of this article, which is to argue that there's plenty of mileage left in good old HD. It's just that you have to use it properly.
The best thing I'd seen on TV
The first time I ever saw Digital Betacam footage from a TV studio, it was the best thing I'd ever seen on TV, by far. If someone had told me it was HD and not SD, I'd have believed them: it was so sharp - and every single line had quite distinct information in it. It looked nothing like the fuzzy, mushy SD that arrived in viewers' living rooms in the days of analogue TV transmission, and equally nothing like the scrappy, pixelated, artefacted stuff that we've become accustomed to since Digital TV became the norm (in these parts, anyway).
That's part of the reason why DVDs still look so good: there's more bitrate available to a DVD player than to a broadcast channel. While a DVD is compressed in pretty much the same way as Digital Terrestrial or Satellite TV, it isn't constricted in the same way. Clever variable bitrate encoding ensures that there are no bottlenecks, so bad compression artefacts are rarely seen.
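The idea behind variable bitrate encoding can be sketched very simply: instead of forcing every frame through the same constant rate, the encoder gives busy, complex frames a bigger share of a fixed bit budget. This toy allocator is purely illustrative, not a real encoder:

```python
# Toy sketch of variable-bitrate allocation (illustrative only):
# split a fixed bit budget proportionally to each frame's complexity,
# so demanding scenes get more bits and quiet scenes give some back.
def allocate_vbr(complexities, total_budget):
    """Return a per-frame bit allocation proportional to complexity."""
    total = sum(complexities)
    return [total_budget * c / total for c in complexities]

frames = [1, 1, 5, 8, 2, 1]           # relative scene complexity (made up)
bits = allocate_vbr(frames, 18_000)   # a hypothetical 18,000-bit budget
print([round(b) for b in bits])       # the busy frames get the lion's share
```

A real encoder does this over a rate-control window with buffer constraints, but the principle is the same: the average rate is fixed, the instantaneous rate is not.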
Blu-ray, which is true HD, is a big step up from DVD, and can look absolutely superb. You rarely hear people saying, as they watch their Blu-ray edition of Avatar, "if only it could be a little bit sharper". You probably wouldn't hear it much if they were projecting their Blu-ray onto a fifty-foot screen, either.
There's a lot in common between digital video and digital audio. What audio concedes to video in bitrates it claims back in complexity when you consider that a typical digital studio session will have perhaps 48 or even 96 tracks. It will probably be recorded at a higher sample rate than CDs (96 or 192 kHz as opposed to 44.1 kHz) and a greater bit depth (24 bits as opposed to 16 bits).
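Those figures multiply up quickly. A back-of-envelope sketch, using an assumed 96-track session at 96 kHz / 24-bit against CD stereo (uncompressed PCM, no overheads):

```python
# Back-of-envelope PCM data rates: a big studio session versus CD stereo.
def pcm_rate_mbps(tracks, sample_rate_hz, bit_depth):
    """Uncompressed PCM data rate in megabits per second."""
    return tracks * sample_rate_hz * bit_depth / 1e6

cd = pcm_rate_mbps(2, 44_100, 16)          # CD stereo: ~1.4 Mbit/s
session = pcm_rate_mbps(96, 96_000, 24)    # 96 tracks at 96 kHz / 24-bit

print(f"CD: {cd:.2f} Mbit/s, studio session: {session:.1f} Mbit/s")
```

The session comes out at over 150 times the CD's data rate, which is the "complexity" the paragraph is pointing at.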
In the early days of digital audio as a domestic format, we had CDs (44.1 kHz, 16-bit) and - for a while, until the format was scuppered by piracy paranoia - Digital Audio Tape (also 44.1 kHz, 16-bit).
The problem with early digital audio recording was that an unexpected peak in the sound would use up more than the available 16 bits and slam into a hard digital ceiling. Digital clipping is very unpleasant and needs to be avoided at all costs. So early digital recording engineers played safe by treating their recording medium as if it had only 12 usable bits. This gave them a margin for safety and lowered stress levels in the studio.
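That 12-bit habit has a tidy interpretation in decibels. Each bit of PCM resolution buys roughly 6 dB of dynamic range, so reserving four of sixteen bits leaves about 24 dB of headroom (my arithmetic, not a figure from the article):

```python
import math

# Each bit of PCM resolution is worth 20*log10(2) ~= 6.02 dB of
# dynamic range. Treating a 16-bit medium as 12-bit therefore
# reserves roughly 24 dB of headroom above the nominal level.
def db_per_bit():
    return 20 * math.log10(2)  # ~6.02 dB

headroom_bits = 16 - 12
print(f"Headroom: {headroom_bits * db_per_bit():.1f} dB")
```

That 24 dB margin is what stood between an enthusiastic snare hit and a ruined take.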