A recent study claims that UHD adoption rate is 10 years ahead of HD's (during its lifecycle). But is this a real distinction, or an illusion?
According to the good folks at IHS-Consumer Electronics & Video Technology, UHD has "hit the ground running, and is about 10 years ahead of where HD TV was just two years after standards ratification." (Thanks to Advanced Television for alerting us to this story.)
Wow! That must mean that everybody on your block is in the market for a UHD television, if they don't already own one, or a few. But let's cool our jets for a moment and unpack what's really been said.
On the surface
The first HDTV went on sale in the U.S. in 1998, eight years after the standard was first ratified. It's easy to contrast that with UHD: the first UHD televisions went on sale in 2013, just one year after the UHD standard came into being.
The UHD standard arrived in 2012, with H.265 HEVC (High Efficiency Video Coding), which allows UHD streams at just double the bitrate of HD rather than four times, following in 2013. By September of this year, UHD television shipments for 2014 totaled 14 million and counting, making up 7% of the entire global market for all televisions.
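The back-of-the-envelope arithmetic behind that "double, not 4x" claim can be sketched as follows. The assumptions here are illustrative, not from the article: bitrate scales roughly linearly with pixel count, and HEVC needs about half the bits of H.264 for comparable quality.

```python
# Rough estimate of a UHD/HEVC stream's bitrate relative to an HD/H.264 stream.
# Assumptions (illustrative): bitrate scales linearly with pixel count, and
# HEVC achieves roughly half the bitrate of H.264 at comparable quality.

HD_PIXELS = 1920 * 1080    # 1080p HD
UHD_PIXELS = 3840 * 2160   # 4K UHD

pixel_ratio = UHD_PIXELS / HD_PIXELS   # 4.0 -- UHD has four times the pixels
hevc_efficiency = 0.5                  # HEVC: ~half the bits of H.264

relative_bitrate = pixel_ratio * hevc_efficiency
print(relative_bitrate)  # 2.0 -- roughly double an HD stream, not 4x
```

In other words, the codec improvement cancels half of the resolution increase, which is what makes UHD streaming over ordinary broadband connections plausible.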
So, in a way, the assertion that UHD is zooming along is fairly accurate. It's really just another case of the Big Bang model of technological development that has replaced the old Bell Curve of the past, as elaborated on by RedShark Editor-in-Chief David Shapton in this article. But it's problematic to compare the adoption of HD televisions to UHD without considering the conditions around the birth of each standard.
The marker in time that the study uses as its starting point is a bit bogus and doesn't take other factors at play into account. The first edition of the HD broadcast standard, ITU-R Recommendation BT.709, was approved way back in 1990. At that time, what we refer to as standard definition ruled the roost and, historically, there hadn't been a major change in broadcast standards since the advent of color transmission in the 1950s in the US and the 1960s in Europe. In addition, we were still in the days of analog; in fact, the United States didn't officially switch over to terrestrial digital broadcasts until 2009.
Compare the movement from SD to HD with the conditions surrounding UHD's arrival: we had already undergone one jump in resolution in recent memory. All of us over the age of 30 probably remember ditching the old tube-style standard definition set for a flat panel HD model. At the time, it seemed like a revolutionary jump to the future of broadcast and display technology. Now, going from HD to 4K, we have a shared, collective experience of resolution platform change that informs, rightly or wrongly, the nature of this shift and the expected pace of change. The shift from SD to HD took around 20 years, so we expect the shift from HD to UHD to take roughly the same amount of time. But this is a wrong-headed view, and I would suggest that we should focus on the current conditions surrounding this platform change instead of simply comparing it to the shifts of the past.
Even casual observers would agree that technology is developing at an ever-increasing rate. But it seems we may be unwilling to factor that into projections about the future for fear of reaching too far. In this case, an earlier IHS Technology report on the UHD transition predicted that it would take 7-8 years to reach the 7% global UHD shipment figure that was actually achieved just two years after initial UHD standards ratification.
One thing I know for sure: we humans are amazingly adept at accepting new technologies into our lives in the moment. Who knows, three years from now, you might not even be able to buy a high definition set, and the HD set you own may be seen as an antiquated device, or the perfect hand-me-down screen for your kids.