
Is Intel in danger of becoming irrelevant?


Image: Intel / RedShark News. Intel may be feeling pressure from all sides.

Opinion: Intel is facing increasing and unprecedented competition on all fronts, so how does it secure its legacy and keep its place at the top table?

For the past few years, personal computer sales have been declining across the board. Every major vendor has reported falling sales and, in addition, the sales they do see are increasingly shifting to lower-end machines. Consumer priorities have moved toward longer battery life and greater portability rather than more performance.

Also during this period, Intel has had some difficulty ramping up its new 14nm FinFET process, leading it to cancel desktop Skylake parts and leaving the high end to the older Haswell processors.

Exacerbating this is the fact that the main difference between Haswell and Skylake is an enhanced floating-point unit, built around the fused multiply-add (FMA) instruction originally made famous in Apple's benchmarketing efforts while promoting AltiVec. The main improvements that Kaby Lake brings to the table over Skylake are lower power consumption and a better GPU.
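For the curious, this is all FMA amounts to in practice: a minimal sketch using the x86 FMA intrinsics (the compiler flag and values are illustrative), computing a*b + c across eight floats with a single rounding step instead of two.

```c
#include <immintrin.h> /* x86 AVX/FMA intrinsics */
#include <stdio.h>

/* Minimal FMA sketch: r = a*b + c in one instruction per 8 floats,
   slightly more accurate than a multiply followed by an add because
   the intermediate product is not rounded.
   Build (illustrative): gcc -mfma fma_demo.c */
int main(void)
{
    __m256 a = _mm256_set1_ps(2.0f);
    __m256 b = _mm256_set1_ps(3.0f);
    __m256 c = _mm256_set1_ps(1.0f);

    __m256 r = _mm256_fmadd_ps(a, b, c); /* r = a*b + c, fused */

    float out[8];
    _mm256_storeu_ps(out, r);
    printf("%f\n", out[0]); /* prints 7.000000 */
    return 0;
}
```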

While this has been going on, AMD and nVidia have continued to make major upgrades to their GPUs. Unlike the shift from Haswell to Skylake and Kaby Lake, nVidia's move from Maxwell to Pascal delivered a huge performance boost alongside significantly lower power consumption. The difference is large enough that replacing a 900-series nVidia GPU with a 1000-series card is actually both more cost-effective and a larger upgrade.

Topping this off is the external GPU pioneered by Razer: taking advantage of Thunderbolt 3, the Razer Core allows a user to add a desktop GPU to a laptop externally. This enables even Razer's smallest and lightest ultrabook, the Razer Blade Stealth, to offer desktop-class GPU performance.

The ARM Factor

It's no secret that ARM is eating Intel's mobile lunch. Intel has made a number of attempts to get its processors into mobile devices, but it has yet to compete effectively with the vast ARM ecosystem, which is now matching and, in some cases, even eclipsing the performance of Intel's low-end desktop parts. With Samsung using a 14nm FinFET process to manufacture its latest Exynos processors, it's becoming clear that Intel is losing its fabrication technology advantage. On top of that, ARM is designed to be power-efficient, giving it a large edge over x86 in the ultramobile space.

With modern ARM processors sporting as many as eight cores, dedicated floating-point hardware (including vector processing units in higher-end models), integrated GPUs with tile-based rendering and even 64-bit addressing, it's not hard to see why Intel's Atom hasn't gained much of a foothold.

Even Microsoft has been experimenting with ARM processors to build high-density server farms. Qualcomm, for example, is developing an ARM-based System on a Chip (SoC) with 24 cores in its initial prototypes and plans for more in the future. Performance-wise, Intel's Xeon still has a significant edge, but ARM is starting to get within spitting distance. With Broadcom and Qualcomm, as well as AMD and several Chinese designs in progress such as Phytium (all aimed at the exascale datacenters of the likes of Google, Amazon, Baidu and Facebook), ARM certainly has the potential to shake up the server industry the same way Intel did with the Pentium Pro. AMD has already launched Seattle, a new ARM-based Opteron processor also aimed at servers.

ARM could also end up being pushed out of the server market before really breaking in, whether by the OpenPOWER initiative or by AMD and Intel themselves. It's too early to say for sure.

One thing is for sure, however: ARM isn't down and out, despite having made little headway in the datacenter market so far. It is making inroads into supercomputing, with none other than Cray announcing an ARM-based supercomputer design called Isambard for the UK Met Office, featuring over 10,000 64-bit ARM cores.


The AMD Factor

AMD's influence here actually cuts both ways. Intel's GPUs, in spite of their improvements, still rate pretty low on the computing totem pole. But AMD's main processors are still pretty thoroughly outclassed by Intel's.

AMD is hoping to address the latter with Ryzen, which looks to be competitive with Intel's desktop processors, based on the initial demos and benchmarks. If AMD is able to capture significant market share from Intel, consumers might finally get to benefit from the competition again; Intel would likely end up lowering prices and also step up to the plate with more aggressive designs once more.

Significantly more bizarre is the possibility that Intel and AMD might be striking an alliance. Hardware website HardOCP is confident in its source, but until AMD and Intel go public one way or another, the claim that Intel might start incorporating AMD GPUs into its processors, replacing its own mediocre graphics with AMD's much more powerful designs, remains just a rumour, albeit a rather interesting one.

The MIC Factor

One of the traits that makes GPUs such a problem for Intel is that modern ones incorporate huge numbers of simple processors optimized for compute-oriented tasks. Intel's attempt to match these has been its Many Integrated Cores (MIC) line, sold under the Xeon Phi brand. These began as simplified versions of Intel's P54C core, now running at higher clock speeds with a Skylake-class FPU, including the fused multiply-add instruction. With current models sporting as many as 72 such cores (each running up to four threads, for nearly 300 threads in flight), on the surface they're not in the same league as nVidia's Pascal GPUs. Core counts don't tell the entire story, however: each core in a Pascal GPU runs one instruction at a time on one piece of data, with groups of 32 masquerading as a single SIMD processor, while the cores in Xeon Phi execute wide SIMD instructions natively. Rather than being outclassed by orders of magnitude, Xeon Phi is only outclassed by a factor of two or three.
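To make that distinction concrete, here is a minimal sketch (the function name and compiler flag are illustrative) of the kind of loop a Xeon Phi core can issue natively as 16-wide AVX-512 instructions. A Pascal GPU achieves the same effect differently, by running the scalar body of this loop on each of the 32 threads in a warp.

```c
#include <immintrin.h> /* AVX-512 intrinsics */

/* y[i] = a*x[i] + y[i] over n floats (n assumed a multiple of 16).
   One Xeon Phi core issues this as 16-wide SIMD instructions from a
   single instruction stream; a GPU would instead run the scalar body
   on many threads, with each 32-thread group behaving like one SIMD unit.
   Build (illustrative): gcc -mavx512f saxpy.c */
void saxpy_avx512(int n, float a, const float *x, float *y)
{
    __m512 va = _mm512_set1_ps(a);
    for (int i = 0; i < n; i += 16) {
        __m512 vx = _mm512_loadu_ps(x + i);
        __m512 vy = _mm512_loadu_ps(y + i);
        /* one fused multiply-add across all 16 lanes */
        _mm512_storeu_ps(y + i, _mm512_fmadd_ps(va, vx, vy));
    }
}
```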

Intel is counting on achieving higher utilization of the cores in Xeon Phi, enabling developers to reach a higher percentage of its peak performance than most will manage with a Pascal, levelling the playing field to some extent. How well this will work remains to be seen, but initial results show some promise: several Top 500 supercomputers have already been built using Xeon Phi.

Since the current Xeon Phi models can host a full operating system, it's possible that we'll start seeing Xeon Phi-based workstations on the market. These could be combined with powerful GPUs, but it's unlikely that Xeon Phi will offer as much single-threaded performance as a contemporary Core or Zen processor, making Core or Zen the better choice as the host processor. High single-threaded performance combined with a large number of compute cores provides system balance: tasks that are parallelizable can be executed on the massively parallel GPU, while tasks that do not parallelize well can be executed on the fast host processor with its small number of cores.
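The balance argument is essentially Amdahl's law. A minimal sketch (the parallel fractions and core count below are illustrative, not measured) shows why the serial portion of a workload, however small, makes a fast host core worth having:

```c
#include <stdio.h>

/* Amdahl's law: overall speedup when a fraction p of the work is
   parallelized across s-way hardware. Numbers are illustrative;
   3584 is roughly the core count of a big Pascal GPU. */
static double amdahl(double p, double s)
{
    return 1.0 / ((1.0 - p) + p / s);
}

int main(void)
{
    /* Even with thousands of parallel cores, a 10% serial fraction
       caps the overall speedup near 10x, which is why a fast host
       core still matters alongside a massively parallel chip. */
    printf("90%% parallel, 3584-way: %.1fx\n", amdahl(0.90, 3584)); /* ~10.0x */
    printf("99%% parallel, 3584-way: %.1fx\n", amdahl(0.99, 3584)); /* ~97.3x */
    return 0;
}
```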

In other words, Xeon Phi is a niche product.

What's next for Intel?

So, what will Intel's next move be? For the near term, Intel's most likely plan is to continue developing x86 processors; if Ryzen proves competitive enough, it may force Intel to lower its prices and step up development of new designs. Both would be good for consumers, though it's very likely that GPUs will continue to overshadow host processors, putting pressure on Intel's and AMD's prices alike.

With the lower end of Intel's market being eaten away by mobile processors, higher-end sales shrinking and AMD getting back in the game, it might seem as if Intel's days of dominance are numbered. It certainly seems as if x86's dominance in personal computing is starting to fade, though it isn't yet seriously threatened in the still lucrative server industry.

It's a sure bet that Intel is aware of the potential in ARM-based servers, as well as the declining sales in the consumer market. Intel is probably also keeping a close eye on Ryzen and, most likely, already has plans underway to keep its future secure.

What those plans are is an open question. Intel hasn't always been particularly successful at branching out in the past: it has tried to get into GPUs several times, never with much success beyond the integrated graphics it effectively gives away. It has also been largely unsuccessful at getting its chips into phones and tablets, with Microsoft's Surface line the only notable exception so far.

The feeling is that something has to give. But what? It's going to be an interesting few months ahead.
