
Living with two different types of 4K


Two types of 4K. Image: LG/RedShark Publications

RedShark 12 Days of Christmas Replays: Some people don't like it when you call UHD 4K. We think it's okay as long as you do it like this...

Most consumers probably don’t realise that there are two different resolutions that could reasonably be described as 4K. There’s 3840 x 2160, which we’re supposed to call “UHD”, and there’s 4096 x 2160, which is the resolution described by the DCI standard for digital cinema. DCI 4K is slightly wider than UHD, but not by much. But it’s different enough that if you show UHD on a DCI 4K screen you’ll get a vertical black bar at each side of the display, and if you scale DCI 4K to fit on a UHD screen, you’ll have black bars at the top and bottom. The alternative is to crop the edges of the DCI 4K image so that it fits the UHD screen exactly, which, in some ways, is a better solution, because rescaling an image always makes it slightly softer unless the scaling factor is an exact ratio such as a half or a quarter.
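To make that geometry concrete, here’s a minimal Python sketch of the fit-versus-crop arithmetic (the function name and constants are ours, purely for illustration):

```python
# Sketch of the fit-vs-crop arithmetic described above.
DCI_4K = (4096, 2160)   # DCI digital cinema 4K
UHD    = (3840, 2160)   # consumer "4K" UHD

def fit_inside(source, screen):
    """Scale `source` to fit entirely within `screen`, preserving
    aspect ratio, and report the resulting black bars."""
    sw, sh = source
    tw, th = screen
    scale = min(tw / sw, th / sh)
    w, h = round(sw * scale), round(sh * scale)
    pillar = (tw - w) // 2   # vertical bars, left and right
    letter = (th - h) // 2   # horizontal bars, top and bottom
    return (w, h), pillar, letter

# UHD shown on a DCI 4K screen: pillarboxed (bars at the sides).
print(fit_inside(UHD, DCI_4K))   # ((3840, 2160), 128, 0)

# DCI 4K scaled down to fit a UHD screen: letterboxed (bars top/bottom).
print(fit_inside(DCI_4K, UHD))   # ((3840, 2025), 0, 67)

# The crop alternative: keep the height, trim 128 pixels from each side.
print((DCI_4K[0] - UHD[0]) // 2)  # 128
```

Note that cropping loses 128 columns of picture at each edge but avoids rescaling entirely, which is why it can look sharper than the letterboxed version.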

It’s somewhat inconvenient that we have two “standards” for 4K. How did it happen? Well, it really just evolved like that. DCI 4K (4096 x 2160) is twice the existing 2K resolution in each dimension (four times the number of pixels), and 2K itself is, annoyingly, just a few pixels wider than Full HD. Rather more helpfully, UHD (3840 x 2160) is exactly twice Full HD in each dimension, or four times the pixel count.
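For anyone who wants to check the arithmetic, here’s a quick sanity check in Python (the names are ours, purely for illustration):

```python
# Sanity check of the resolution relationships described above.
FULL_HD = (1920, 1080)
DCI_2K  = (2048, 1080)
DCI_4K  = (4096, 2160)
UHD     = (3840, 2160)

def pixels(r):
    return r[0] * r[1]

# DCI 4K doubles 2K in each dimension, so four times the pixel count.
assert DCI_4K == (2 * DCI_2K[0], 2 * DCI_2K[1])
assert pixels(DCI_4K) == 4 * pixels(DCI_2K)

# UHD doubles Full HD in each dimension, so four times the pixel count.
assert UHD == (2 * FULL_HD[0], 2 * FULL_HD[1])
assert pixels(UHD) == 4 * pixels(FULL_HD)

# And 2K is only slightly wider than Full HD.
print(DCI_2K[0] - FULL_HD[0])   # 128 pixels
```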

We (RedShark, that is) have always described both cinematic DCI 4K (4096 x 2160) and UHD (3840 x 2160) as 4K. We do this for two reasons. First, they are very close: you would have to be very pedantic to claim you can see a difference in quality between them - especially when some people still say you can’t notice the difference between HD and 4K! Second, we are not alone in this. Most TV manufacturers - and camera makers too - describe UHD as 4K. We just see them as two types of 4K.

In circumstances where it matters to distinguish between them, for example when we are talking about the exact raster size of an image, we will make it clear whether we’re talking about DCI 4K or UHD, but the rest of the time, in our view, it really isn’t necessary.

So, just to sum up: most of the time, when we’re talking about 4K, we will just say “4K”, whether it’s DCI Cinematic 4K or UHD. The point is that either of these 4K standards is approximately twice the linear resolution of Full HD, with four times the number of pixels. It’s much the same as if there were two shades of red so close together that you had to resort to Pantone numbers to tell them apart. That certainly wouldn’t stop you calling both of them “red”. It’s only if your red car needed a partial respray that you’d bother with the Pantone numbers. (Actually, they don’t use Pantone numbers for cars, do they?)

Where we do need to distinguish between the two sizes of 4K, we will do it exactly as we have in this article: DCI 4K (4096 x 2160) and UHD (3840 x 2160).

This isn’t going to change the world, but it does mean we can stop using the term “UHD”, which we think was never the most helpful way to describe a 4K television.
