
Should we fight to keep desktop computer development alive?


Image: Apple / RedShark News. Apple's Mac Pro: one of the most powerful desktop computers ever.

While we enjoy the benefits of low power mobile processors every time we use our smartphones, Phil Rhodes urges further development for their desktop and workstation cousins.

A little while ago, we looked at the problems facing efforts to improve the performance of modern computers. The brute-force approach of ever-increasing CPU clock rates can't go much, if any, further, and the alternative – more cores in each CPU – has problems of its own, which mean that the year-on-year, very visible performance doubling of the late 1990s and early 2000s is gone and not likely to return in recognisable form. But beyond that, there's another, perhaps more insidious problem. Not only can we not keep the performance upgrades coming, but there's an element in the computer industry right now that doesn't even want to.

Specialised (limited) devices

Think about tablets and smartphones. They seem fast, but really they're fast at a few specific things, such as decoding video and rendering 3D scenes. These capabilities – particularly the 3D rendering – make both portable and desktop devices seem fast in ways that they really aren't, inasmuch as the CPU of a PlayStation 4 is orders of magnitude too slow to render its graphical output without the help of a dedicated GPU. They seem fast, in short, because of exactly the sort of hardware that isn't a universal solution to the overall performance problem. But worse than that is the fact that the processors in tablets and phones aren't being optimised for capability. They're being optimised for low power consumption and cost, right down to the level of having several cores of varying capability that can be individually powered up or down depending on the processing load at any one time. This sort of shenanigans is why your phone lasts a day in your pocket but wakes up more-or-less instantly on demand. It's very clever and laudable work, but it isn't making your After Effects renders go any faster.
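For the curious, here's a minimal, purely illustrative Python sketch of that idea – routing light loads to a slow, efficient core and only waking a fast core when demand spikes. The class, the numbers and the threshold are invented for illustration and don't correspond to any real mobile scheduler.

```python
# Illustrative only: a toy model of heterogeneous-core scheduling,
# not how any real mobile SoC or OS scheduler is implemented.

from dataclasses import dataclass

@dataclass
class Core:
    name: str
    relative_speed: float   # throughput relative to the big core
    power_watts: float      # rough active power draw (made-up figures)

LITTLE = Core("efficiency core", relative_speed=0.35, power_watts=0.3)
BIG    = Core("performance core", relative_speed=1.0,  power_watts=2.0)

def pick_core(load: float, threshold: float = 0.4) -> Core:
    """Route light loads to the efficient core, heavy loads to the fast one."""
    return LITTLE if load < threshold else BIG

for load in (0.05, 0.2, 0.8):   # idle-ish, background sync, game/render
    core = pick_core(load)
    print(f"load {load:.2f} -> {core.name} "
          f"(~{core.power_watts} W, {core.relative_speed:.0%} speed)")
```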

Poor substitutes

The problem now becoming clear is that tablets and phones are, in at least some circumstances, replacing desktop PCs. In doing so, they're also displacing the research and development effort that made the enormous historical performance increases of desktop PCs possible. Desktops and workstations are as fast as they are because of the huge demand for office PCs. Video edit workstations have enormous GPU capabilities because of the mass market for video games. The market for big boxes to run big video edit and effects applications is nowhere near large enough to support the research and development required to create those boxes as they exist today. We owe it all to video games and office PCs, and that's been obvious since the late 90s.

A lot of equipment manufacturers spent a lot of time claiming that their high-end products used special high-end mathematics and – sniff! – naturally, the same work could not possibly be done on low-cost commodity hardware. In reality, it was obvious to anyone involved in video games that things like colour grading were already being done on commodity hardware. As a result, companies like da Vinci, which made a living out of expensive FPGA racks, got absolutely steamrollered by cheap PC hardware that simply could not have existed without video games.
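To see why that claim didn't survive contact with reality, here's a hedged sketch of the core of a basic lift/gamma/gain grade in NumPy: nothing more exotic than per-pixel arithmetic, which is exactly what commodity CPUs and GPUs are built to do in bulk. The parameter values are arbitrary, and a real grading system obviously does a great deal more than this.

```python
# A minimal lift/gamma/gain grade in NumPy, purely illustrative.
# Real grading pipelines add colour management, log/linear transforms,
# secondaries and so on, but the core maths is this simple.

import numpy as np

def lift_gamma_gain(img, lift=0.0, gamma=1.0, gain=1.0):
    """Apply a basic grade to an image with float values in [0, 1]."""
    graded = img * gain + lift
    return np.clip(graded, 0.0, 1.0) ** (1.0 / gamma)

# A random stand-in for a UHD frame (2160 x 3840, RGB).
frame = np.random.rand(2160, 3840, 3).astype(np.float32)
out = lift_gamma_gain(frame, lift=0.02, gamma=1.1, gain=0.95)
print(out.shape, out.dtype)
```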

But the tablet and smartphone market doesn't offer us the same thing. A need for small, cheap, low-power processors to make Angry Birds work on a device that lasts a weekend on a charge contributes practically nothing to the quest for a faster Fusion render. Okay, there have been proposals to build multi-core CPUs from ARM processors, leveraging the fact that they don't consume much power or get very hot as a way of packing more cores into a device. But computer science of the last ten or fifteen years has begun to give the lie to the idea that smaller instruction sets are necessarily that much more power-efficient than larger ones, especially since most modern CPUs are, internally, reduced-instruction-set engines that emulate a more complex instruction set using microcode. And even if that were to work – even if the low-power technology of smartphones made it possible to build more effective multi-core CPUs – that would be nice, but for all the reasons we looked at last time, multi-core isn't a complete solution.
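As a quick reminder of why, here's a worked example of Amdahl's law: if even a modest fraction of a render is inherently serial, the benefit of extra cores flattens out quickly. The 10% serial figure below is an assumption chosen for illustration, not a measurement of any real application.

```python
# Amdahl's law: speedup = 1 / (serial + parallel / n_cores)
# Illustrative numbers only; the serial fraction of a real render varies.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

serial = 0.10  # assume 10% of the work can't be parallelised
for cores in (1, 2, 4, 8, 16, 64, 1024):
    print(f"{cores:>5} cores -> {amdahl_speedup(serial, cores):5.2f}x")
# Even with 1024 cores, the speedup tops out just under 10x.
```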

Processor plateau?

Companies such as Intel and AMD would, we can safely assume, reassure us that R&D on faster processors is proceeding apace, but it's foolish to believe that it's happening now at the same rate it was in, say, 2005. At that point, the world's two most prominent CPU manufacturers were competing healthily and the clock speed peak hadn't quite yet been scaled. Now, with Intel so abundantly ascendant and physics asserting itself, there would be more than enough reason to worry about R&D funding, even if the industry's collective eye wasn't being dragged so forcefully away from the ball by the quick profits of quick-turnaround products like phones and tablets. Even Intel has produced ultra-low-power versions of its Atom processors for Android tablets. It's not at all comforting for the industries out there that still need powerful desktop PCs. The desktop cannot become the poor cousin. For there to be a second screen, there has to be a first screen.

All of that is, I think, more than enough reason to suffer an increasingly specific sense of unease about the potential for future growth in workstations, especially as cameras increase in both resolution and bit depth and make ever greater demands on postproduction systems. We absolutely want faster smartphones and tablets. They're lovely, and they make nice remote controls for the big iron that does the real work, but at some point there has to be some big iron and it has to do some real work. Yes, you can edit on a tablet; yes, you can shoot on a phone; but nobody wants to sit down and try to put together a 4K, 60fps HDR composite on one.

Though apparently, more-or-less everybody wants to sit down and watch 4K, 60fps HDR on one.
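To put rough numbers on those demands, here's a back-of-envelope calculation of the uncompressed data rate of a 4K, 60fps, 10-bit stream – the sort of material a workstation has to shift in real time during a composite. The assumptions (full RGB, no chroma subsampling, no compression) are deliberately simplistic.

```python
# Back-of-envelope data rate for uncompressed UHD 60p, 10-bit RGB.
# Simplistic assumptions: full-range RGB, no chroma subsampling, no compression.

width, height = 3840, 2160
fps = 60
bits_per_sample = 10
channels = 3  # R, G, B

bits_per_second = width * height * channels * bits_per_sample * fps
print(f"~{bits_per_second / 1e9:.1f} Gbit/s, "
      f"or ~{bits_per_second / 8 / 1e9:.2f} GB/s per stream")
# Roughly 14.9 Gbit/s (~1.87 GB/s) -- and a composite juggles several streams.
```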

