A few days ago we ran this article about the difference between 4K and UHD. Most of the time it doesn't matter, but sometimes, it's a crucial distinction.
In a comment following the article, David Abramson, based in LA, gave us a fascinating picture of the real background to 4K. Here's what he said:
"As a feature film editor, the meaning of "4K" was clear to me over a decade ago, before it became a marketing term. To complete the picture for your readers, "4K" should correctly refer ONLY to the horizontal resolution. The vertical resolution is then derived from the desired aspect ratio (cinema uses square pixels, so the pixel aspect ratio does not come into play). 4K is 4096 across. Similarly, 2K is 2048 across. Every other usage of those two terms is incorrect in the feature film / DI world. Somehow, we escaped that confusion with 2K vs. HD, but failed to dodge the marketers' bullet this time around. For the record, the cinematic definition of "4K" came first. ;o)
Also of note: the 4K and 2K standards in film work also describe minimum color depth. Developed in an age of far less computing horsepower, the 2K standard in particular makes a very intelligent compromise between resolution (pixels) and bit depth (range of colors). When I was working on the DI of "High School Musical 3" in 2K, we were occasionally also screening 1080p / HD visual effects material. The difference in apparent resolution was clear upon projection (and was not caused by scaling -- I checked). In other words: color has much to do with apparent sharpness. More precise anti-aliasing is the reason why: edge detail can be better described by the finer "color resolution" (color depth). This is largely missing from the public discussion of what the future of acquisition, transmission, and display technologies should be.
Greetings from LA!"
Thanks for the insight, David!
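To put numbers on David's first point: in the cinema convention, "4K" and "2K" name the horizontal pixel count, and the vertical count simply follows from the aspect ratio (square pixels, so height = width / ratio). Here's a minimal sketch of that arithmetic; the rounded heights are illustrative, and the official DCI containers fix their own exact crops:

```python
# Sketch of the DCI-style convention David describes: the label names the
# width, and the height is derived from the chosen aspect ratio.
# Rounded results are illustrative, not official DCI crop dimensions.

def vertical_resolution(width: int, aspect_ratio: float) -> int:
    """Derive the vertical pixel count from width and aspect ratio
    (square pixels assumed, as in cinema work)."""
    return round(width / aspect_ratio)

for width, label in [(4096, "4K"), (2048, "2K")]:
    for ratio, name in [(1.85, "flat"), (2.39, "scope")]:
        height = vertical_resolution(width, ratio)
        print(f"{label} {name} ({ratio}:1): {width} x {height}")

# Consumer "4K" UHD, by contrast, is a 16:9 frame defined as 3840 wide:
print(f"UHD: 3840 x {vertical_resolution(3840, 16 / 9)}")
```

Note how the marketing term flips the convention: UHD's 3840 x 2160 is defined by its 16:9 frame, while cinema 4K is pinned to 4096 across regardless of how tall the frame is.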
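David's second point, that bit depth affects apparent sharpness, comes down to this: an anti-aliased edge is a smooth ramp of intermediate intensities, and a deeper color depth can describe that ramp with finer steps. A hedged sketch of the idea, counting how many distinct code values each bit depth can assign across a subtle edge transition (the ramp range, sample count, and bit depths here are illustrative, not any production standard):

```python
# Illustrative sketch: quantize the same gentle intensity ramp (standing in
# for an anti-aliased edge) at several bit depths and count distinct codes.
# More codes across the transition means finer edge description.

def quantize(value: float, bits: int) -> int:
    """Map a 0.0-1.0 intensity to the nearest integer code at a bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels)

# 64 samples spanning a subtle transition from 40% to 42% intensity.
ramp = [0.40 + 0.02 * i / 63 for i in range(64)]

for bits in (8, 10, 12):
    distinct = len({quantize(v, bits) for v in ramp})
    print(f"{bits}-bit: {distinct} distinct codes across the edge")
```

At 8 bits the whole transition collapses onto a handful of codes, producing coarser steps than 10 or 12 bits allow, which is one plausible reading of why the 2K DI material held up better on projection than 1080p material of nominally similar pixel count.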