
ARM chipsets will probably change our kit for ever


Qualcomm ARM will probably change everything

These days you often find that the most cutting-edge technology is in our smartphones and not our cameras. But the professional and the consumer markets do cross-fertilise, and it benefits us all.

The smartphone boom of the last few years has made it really, really obvious that we're all carrying around full-blown, general-purpose computers in our pockets. These are computers which are really quite some distance from the relatively limited things, the VCRs and Magimixes, that we were once told were theoretically computers.

This is interesting from the perspective of anyone who works with technology, because smartphone tech benefits greatly from the commoditisation of mass consumer appeal and is therefore very cheap, as well as being designed specifically for portable, battery-powered applications. The entire field of tablet computing, for instance, owes its existence to the realisation that the phone form factor is a useful one when built at four times the usual size. For one, you can write a slate application for it and have the smartphone output timecode (albeit with some limitations on accuracy, as tests have shown).

It is of course arguable that Star Trek invented the tablet computer, and its Next Generation incarnation suggested the flat black form factor, but I digress.

This technology is what we're using to make movies

This has huge relevance to film and TV people because this technology is, especially now we're moving away from film, exactly what we're using to make movies. File-based acquisition is now practically universal and the information technology backbone is built out of exactly the same hardware used by everyone else. Intel does not, Xeon aside, make “professional” or “amateur” versions of its microprocessors. There is no clear dividing line as there was between 8mm, 16mm, and 35mm in film. And Intel is particularly relevant at this point, because for the past thirty years, as we slowly computerised all of our technology in ways both covert and obvious, Intel has been there all along, providing us with the ability to build desktop computers, workstations, servers, and computers in a box that we call “offload stations” and put in the camera truck to manage our data workflows.

 

SGI workstation - Wikipedia

 

There have always been competitors. The Silicon Graphics workstations of the 1990s used MIPS R-series processors, and until fairly recently Apple used non-Intel hardware, first Motorola's 68000 series and then IBM's PowerPC. But once Apple moved to Intel after 2005, with Commodore and Atari having failed in the home computer market in the 90s, only a few companies – including Cyrix and AMD – were left producing microprocessors to do the heavy lifting of the world's computation. That's still the case, and for the longest time it was difficult to imagine how Intel, or at least Intel-compatible hardware, could be dethroned.


It's all for Intel

Why? Because all of the world's best software is written for Intel processors.

What that means is fairly straightforward. Every type of processor has an instruction set, akin to a language, that it understands. Instructions may be as simple as adding two numbers, or as complicated as performing long division on a set of eight really big fractional numbers while keeping a note of the remainder. All processors share basic instructions, such as add, divide, multiply, and so on. But the instruction sets of different processor families are not identical – and this is the main reason why you cannot, for instance, run software for an Intel Mac on an old one equipped with a PowerPC processor.
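
To make that concrete, here is a very small sketch (not from the original article): one trivial C function, with comments showing roughly the instructions a compiler might emit for it on x86-64 and on 32-bit ARM. The exact output varies by compiler and settings; the point is simply that the two sets of machine code are mutually unintelligible.

    /* add.c - one trivial C function, two instruction sets. */
    int add(int a, int b)
    {
        return a + b;
    }

    /* Roughly what a compiler might emit:

       x86-64 (Intel):
           mov  eax, edi        ; copy the first argument
           add  eax, esi        ; add the second
           ret                  ; result is returned in eax

       32-bit ARM:
           add  r0, r0, r1      @ add the two arguments
           bx   lr              @ return; result is left in r0

       Same source, same meaning, incompatible machine code. */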

It is possible, as Apple have demonstrated, to translate (“port”, as we say) software between different processor architectures. Software is not usually written by manually compiling a list of processor instructions, which would be absurdly labour-intensive. Instead, it's written in a language such as C, which is really just a rather limited subset of English with specific punctuation, designed to let the programmer specify the broad strokes of how software should work. A compiler translates the C code into lists of instructions that the processor can understand. Often, simply switching compilers can do most of the work required to make a program written in C usable on a completely different processor architecture. But to make it run reliably, and well, manual intervention is invariably required, especially to make sure the program makes best use of any high-performance features that are specific to the target processor. This can be an enormous amount of work, and it must be done separately for every program that needs to be translated.
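
As an illustration of where that manual work comes from (a sketch, not anything from a real product), here is a C function that adds two buffers of samples. The plain loop is portable and any compiler can translate it; the fast paths use Intel's SSE and ARM's NEON intrinsics respectively, and each has to be written, and maintained, separately for its processor family.

    /* add_frames.c - the portable loop compiles anywhere; the fast paths
       are tied to one processor family's special instructions. */
    #include <stddef.h>

    #if defined(__SSE__)
    #include <xmmintrin.h>                 /* Intel SSE intrinsics */
    #elif defined(__ARM_NEON) || defined(__ARM_NEON__)
    #include <arm_neon.h>                  /* ARM NEON intrinsics */
    #endif

    /* Add two buffers of floats, e.g. mixing two audio or pixel planes. */
    void add_frames(const float *a, const float *b, float *out, size_t n)
    {
        size_t i = 0;
    #if defined(__SSE__)
        for (; i + 4 <= n; i += 4)         /* four floats at a time on Intel */
            _mm_storeu_ps(out + i,
                          _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    #elif defined(__ARM_NEON) || defined(__ARM_NEON__)
        for (; i + 4 <= n; i += 4)         /* four floats at a time on ARM */
            vst1q_f32(out + i,
                      vaddq_f32(vld1q_f32(a + i), vld1q_f32(b + i)));
    #endif
        for (; i < n; ++i)                 /* portable fallback for the rest */
            out[i] = a[i] + b[i];
    }

Built with an ordinary compiler on an Intel machine, or with a cross-compiler such as arm-linux-gnueabihf-gcc for ARM, the portable part just works; the architecture-specific part is exactly the kind of thing that has to be redone by hand for every port.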

And to reiterate, the vast majority of all the world's software, principally that which runs under Windows and the Unix variants, is written for Intel's instruction set. So, it wouldn't be completely curmudgeonly to say that the reason Intel is used for desktop and workstation computers has little to do with the fact that Intel makes good processors (though they do); we use them because using anything else would require an incredible amount of work to be done, work that nobody has any real financial interest in doing.

Except, possibly, ARM. The Cambridge-based company does not make processors as Intel does; instead, it produces designs, ranging from a simple instruction set reference to a full set of photolithography masks for a CPU. ARM licenses these designs to companies that need a processor, and the licensee can then build and sell chips based on them, paying a per-CPU fee. Topically, the image processing and camera-control devices that Canon call DIGIC include an ARM processor core, implemented by Texas Instruments. So it's been possible for years to make a desktop PC out of an ARM processor, albeit at a lower performance and power-consumption point than the one Intel targets. Nobody has done this because there was never a worthwhile amount of software to run on such a machine. But with Google's Android providing a stable, consistent operating system, and smartphone (then tablet) app stores providing the financial incentive, software started to become available, and is now commonplace. The Samsung Galaxy Tab 2 comes with Polaris Office, a broad equivalent to Word, Excel and PowerPoint. [And don’t forget this: http://www.redsharknews.com/business/item/193-is-this-the-most-important-new-computer-for-a-decade]


An extremely big deal

This is potentially an extremely big deal. There are a lot of desktop PCs in the world which exist to run Word, Excel and PowerPoint, and the supply of general-purpose applications for Android on ARM is huge and continues to grow. That said, it's far from a done deal: ARM's designs are heavily optimised for low power consumption at comparatively low performance, and they don't have anything that can challenge Intel's most powerful chips. Intel, for their part, don't scale down as far as ARM do, although there looks to be fierce competition for the middle ground, where Intel's “Centerton” Atom series and ARM's Cortex-A9 are broadly comparable.

The principal technological difference between what ARM and Intel do is in their instruction sets. Simply put, Intel's approach is to provide a rich set of instructions, each of which is capable of performing a comparatively large amount of work, but which requires a complex processor design that either limits speed or consumes a lot of power. Conversely, ARM's processors provide a more limited instruction set, but the simpler design allows for either higher speed or lower power consumption. This is the essential difference between complex instruction set computing (CISC) and reduced instruction set computing (RISC), and the relative benefits of each approach have been debated for decades. The ability of a RISC processor to execute instructions more quickly is intended to offset the fact that each instruction does less, and ARM are currently promoting their processors heavily on the basis that they provide better performance for a given amount of power.
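
As a rough, illustrative sketch (again, not from the article, and real compilers and cores are far subtler than this), consider how the inner statement of one small C loop might land on each style of processor.

    #include <stddef.h>

    /* Sum an array: the inner statement is total += p[i]. */
    int accumulate(const int *p, size_t n)
    {
        int total = 0;
        for (size_t i = 0; i < n; ++i)
            total += p[i];       /* the compiler chooses the instructions */
        return total;
    }

    /* A CISC machine (Intel x86) can fold the memory read into the
       arithmetic instruction itself:
           add  eax, DWORD PTR [rbx]   ; read memory and add, in one instruction

       A RISC machine (classic ARM) keeps instructions simple and separate,
       loading first and adding afterwards:
           ldr  r1, [r2]               @ load the value from memory
           add  r0, r0, r1             @ then add it

       Each RISC instruction does less, but the simpler hardware behind it
       can be clocked faster or built to draw less power. */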

Is RISC better?

It has long been received wisdom that this is the case. Recently, though, a paper by researchers at the University of Wisconsin concluded that it's very nearly a dead heat, and that there aren't really any noticeable efficiency savings with RISC that couldn't be explained by other factors. Even so, ARM have a very mature selection of processor designs which are optimised to work at the lower power and performance point required by popular portable technology, in contrast to Intel's decision to push for the highest possible performance. This battle is likely to be hard-fought: the reigning champion of high performance, with its enormous manufacturing capability, facing off against the current king of low power consumption, with its enormous installed base.

The interesting question is whether Intel might choose to license ARM designs and manufacture them. They are certainly capable of doing so, at least as capable as, and perhaps more capable than, anyone else. Intel operates, among other things, a foundry business, by which we mean that they already accept manufacturing work on other people's designs. They have made ARM processors before, in the form of StrongARM and XScale. And as recently as last year, Intel was actually ARM's biggest customer, representing 7% of their revenue (presumably mainly based on patent licensing agreements). But right now, the flagship products of both companies are in competition. Intel have been optimising their low-power Atom series for mobile applications, just as ARM have begun to propose their higher-powered devices for servers.

All of this could be taken as a complicated way of saying that smartphone tech allows us to make smaller, cheaper, less power-hungry camera equipment, and the reality really is almost that simple. What it means for cameras is that things like Canon's DIGIC series should get even more capable, even cheaper, and even more frugal on power as market forces continue to drive processor manufacturers forward, which should surprise nobody. What would be slightly more surprising is if we start seeing the widespread uptake of ARM processors in desktops and workstations. The technology isn't quite there yet, but the prospect doesn't seem impossible.

Tags: Technology
