OLEDs have been knocking around for longer than LCD screens, but are only now starting to make a real impact on the mainstream. And while there is still room for improvement, the feeling is that the momentum behind them is finally growing.
For the overwhelming majority of television's history, pictures have been displayed on one technology: the cathode ray tube. It's only in the last decade or two that we've seen common use of TFT-LCD and, briefly, plasma, as well as the very occasional back-projection device using a micro-mirror panel. OLEDs have actually existed for longer than LCD televisions have been common, though: they were made practical at Kodak, of all places, in 1987. The fact that it has taken this long for them even to begin reaching the mainstream is an indication of how genuinely new the technology is.
Still, in early 2017, we are certainly seeing signs of them going mainstream. There have been OLED displays in car radios for years, but only in the last year or so has it become easy to buy a home TV which uses the technology. LG has had displays on sale for a while, and Panasonic is already promoting sets with all of the acronyms — HDR, 4K and OLED in one box — due, according to people in Panasonic shirts in consumer electronics stores, around the end of May. This ought to be great. When everyone has a display capable of mastering-grade images, we'll be able to expose and grade images using the full capabilities of cameras and transport media, in the full expectation that things will be rendered as intended by the consumer's display.
The status display on the side of this F65 is an OLED too - they've been common in this sort of application for a while
Well, actually no, of course, we won’t. There are several reasons for this, some of which are more interesting than others. Some of the less interesting reasons are the tendency of manufacturers to build really quite good displays (honestly, home TVs have become much more respectable since TFT-LCD got good) and then afflict them with Bullshit Mode. This Mode operates under many disguises; some displays have Game Mode and Movie Mode, others refer to special post-processing, scaling or noise-reduction options, but they're all a proudly-released product of Bullshit Technologies Incorporated.
Confronted with a display like this, many consumers know only one way to behave - select the one with the brightest colours
The problem OLED solves
Professional displays like this OLED-based TVLogic LEM-250 are free of consumer-targeted encumbrances, but still need proper configuration
Naturally, no video display has any business having any mode other than “standard,” but even that's complicated when material can be supplied in many different formats and information about the format in use may not make it to the display intact. OLED doesn't solve any of those problems. No display technology can, although Dolby Vision makes a fairly respectable stab at it. However, with many consumer OLEDs to date, there have been some more fundamental issues, which have put really perfect pictures just a little way out of reach.
Some of the best displays on the planet are panels made out of traditional LEDs, which have more or less ideal performance. Cost and minimum size are limitations
To understand why, it's worth a quick recap of why we like OLED in the first place. What we really want is a way to create nearly 25 million points of light (for a roughly 4K display) which are individually controllable between a very bright maximum — 1000 candela per square metre for HDR — and a minimum that's ideally absolutely black. These points of light need to be very specific shades of red, green, and blue. They need to react almost instantaneously and to be viewable from any angle without altering in apparent brightness or colour, while the surface somehow needs to exhibit neither specular nor diffuse reflection. We need these things in sizes from a few inches to many tens of inches diagonally. OLED is the current best bet.
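That "nearly 25 million" figure is simple arithmetic, sketched below on the assumption of a standard UHD panel with three separately controllable emitters per pixel:

```python
# Back-of-the-envelope subpixel count for a UHD "4K" panel.
# Each pixel needs individually controllable red, green and blue
# points of light, so the total is width x height x 3.
width, height = 3840, 2160           # UHD resolution (assumed)
subpixels = width * height * 3       # one R, G and B emitter per pixel
print(subpixels)                     # 24883200 -- "nearly 25 million"
```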
One of the ironies of this sort of new technology is that it tends to come along at a time when the incumbent has become really quite good. It's now possible to satisfy almost all of these requirements with TFT-LCD technology. Really, the only things wrong with it are the minimum black level, occasionally a very small amount of off-axis colour shift and sometimes a touch of lag on really fast changes. An LCD display can emit any colour that the backlight (invariably LEDs, these days) can produce and the dyes screen-printed onto the filter layer can pass. Making OLEDs emit the right colours is a matter of formulating the materials used to make them in order to provoke the right physics (by default, the basic arrangement emits a sort of yellow-green). The most crucial problem with OLEDs, however, is sheer output.
OLED displays can be transparent, as with these see-through advertising hoardings
The problems with OLED
Professional OLEDs can be bright - this image is exposed for the monitor display, not the surroundings, which were not as dull as they look here
Fill the screen with white on some current consumer OLEDs and the display brightness will plunge below a couple of hundred nits. This is a rare situation, of course. The peak brightness is often a more noticeable problem. It is essentially impossible for affordable, consumer OLEDs to hit the higher, 1000-nit target for consumer HDR. Their inky blacks help with the impression of strictly comparative brightness and watching in subdued light helps too, but it's ultimately an issue of the underlying technology. OLEDs aren't bright in the same way as normal LEDs (as used to backlight most TFT-LCD displays) and if they're driven too hard, they tend to age quickly. This is especially a problem with the blue channel. Electronics designed to emit blue light are, by their nature, working harder than electronics emitting red. Blue light is at a higher frequency than red, and therefore represents more energy. It's directly for this reason that blue LEDs have a higher forward voltage than red. The blue tube in old-style CRT projectors was always the limiting factor for overall brightness.
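The energy argument can be made concrete with the photon-energy formula E = hc/λ. The sketch below assumes typical emission wavelengths of around 625nm for red and 465nm for blue; the resulting energies in electron-volts track the familiar forward voltages of real LEDs (roughly 1.8–2V for red, around 3V for blue):

```python
# Illustrative only: why a blue emitter works harder than a red one.
# E = h*c / wavelength gives the energy of a single photon.
h = 6.626e-34   # Planck's constant, joule-seconds
c = 2.998e8     # speed of light, metres per second

def photon_energy_ev(wavelength_nm):
    """Energy of one photon at the given wavelength, in electron-volts."""
    joules = h * c / (wavelength_nm * 1e-9)
    return joules / 1.602e-19   # convert joules to eV

red = photon_energy_ev(625)     # typical red LED wavelength (assumed)
blue = photon_energy_ev(465)    # typical blue LED wavelength (assumed)
print(round(red, 2), round(blue, 2))   # ~1.98 eV red, ~2.67 eV blue
```

Each blue photon carries roughly a third more energy than a red one, so for the same light output the blue emitter is simply doing more work — and in an OLED, that extra stress shows up as faster ageing.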
Blue is hard, and some early OLEDs were notorious for having their blue elements fade, resulting in a gradual yellowing of the image. This was bad enough on a studio monitor, where shorter lifetimes may be an acceptable tradeoff for the zenith of performance, but it's absolutely not an option for a piece of consumer electronics. Because of this and other issues surrounding brightness, some consumer OLEDs have actually used white-emitting OLED material, tinted with filters, much like a TFT-LCD. Beyond that, some of them have also included display elements intended to emit only white light, so as to achieve higher peak brightness. This smacks slightly of desperation, as it naturally desaturates colours as brightness increases and makes the display noticeably less accurate.
Blackmagic's viewfinder was possibly the only full HD OLED one available at launch, an achievement overshadowed by the camera it is designed for
Whether the next generations of OLEDs will continue to suffer these problems remains to be seen, but there's still plenty of room for improvement. A sensible long-term target might be at least 2000 nits for really good HDR, with peak white achievable regardless of how much of the display it fills. Existing displays — overlooking the issues of adding a white element for more brightness — tend to have reasonable colour performance as it is, although that's often quoted in terms of the DCI-P3 colour space, not the (much) more demanding Rec. 2020 primaries, which are extremely difficult to cover fully, but which represent a worthwhile target nonetheless.
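To put "much more demanding" in rough numbers: using the published CIE 1931 xy chromaticity coordinates of each standard's primaries, the area of the triangle they enclose is a crude but common proxy for gamut size. This sketch (triangle area via the shoelace formula; not a perceptually uniform comparison) shows Rec. 2020's triangle is roughly 40% larger than DCI-P3's:

```python
# Sketch: comparing DCI-P3 and Rec. 2020 gamut triangles in CIE 1931 xy.
# Triangle area in xy space is a rough proxy for gamut size, not a
# perceptually accurate measure.

def triangle_area(points):
    """Area of a triangle from three (x, y) points (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published xy chromaticities of the R, G, B primaries
dci_p3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(rec2020) / triangle_area(dci_p3)
print(round(ratio, 2))   # ~1.39 -- Rec. 2020 is a much bigger ask
```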
Consumer OLEDs can be quite good – but we'd still like to see better.