
The future of GPUs: Intel is getting serious [Opinion]


Image: Cristiano Siqueira. A concept image of an Intel Xe GPU.

Intel's proposed Xe architecture shows that the company is getting very serious about graphics.

Nvidia's hardware raytracing has had a mixed reception among consumers. To add it, Nvidia dedicated die area that could otherwise have gone to additional CUDA cores for more compute performance, and the same is true of the Tensor cores.

Since Nvidia's only competition in desktop GPUs has been AMD, and AMD has not been competitive, it's a risk that Nvidia was in a good position to take. It's possible that the RT and Tensor cores will end up being a failed experiment, but it's more likely that they'll become a standard feature in future GPUs.

AMD's response to Nvidia's Turing, while impressive on paper, has been fairly disappointing for most creative applications. Nearly all of the new GPU-optimised applications and plugins available are CUDA-optimised, including Red's new GPU-accelerated SDK, SGO's Mistika Boutique and most of the BorisFX plugin suite.

This CUDA bias is another symptom of the fact that Nvidia doesn't really have solid competition. AMD's Radeon VII has an enormous amount of computing power, but since AMD offers neither the new hardware features nor the developer-relations support that third-party developers need, those developers aren't getting on board.

Navi GPU

What AMD has revealed so far about the upcoming Navi GPU is that it's aimed primarily at the mainstream market rather than the high end. That's unwelcome news for AMD's fans, but worse still is that Navi will likely debut in the PlayStation 5, with the desktop versions following afterwards. With the PlayStation 5 launching in 2020 rather than 2019, we might not get a Navi desktop GPU until 2020 at the earliest, and even then it seems that not even AMD expects it to be competitive.

While that's disappointing for AMD fans, it's still good news overall: AMD has the contracts for the PlayStation 5's CPU and GPU and, most likely, for the next-generation Xbox as well, so it stands to earn a lot of money from them. That's the obvious reason the console parts are a higher priority for the Radeon Technologies Group than the desktop models.

Nvidia's domination might finally be coming to an end, however.

In 2017, Intel hired Raja Koduri, formerly chief architect of AMD's Radeon Technologies Group; he's now the senior VP of Intel's Core and Visual Computing Group. Unlike AMD, Intel has plenty of money in spite of its recent troubles.

At the FMX conference in Germany at the beginning of May, Intel made a big reveal about the future of what it is calling the Xe, or “Exponential”, architecture: it's going to include support for hardware ray-tracing.

Intel is working with third-party software developers to develop its Rendering Framework, which currently includes:

  • Embree, a ray tracing kernel
  • OSPray, a ray-tracing rendering toolkit that includes path and volume tracing, designed to be scalable and to support clustering
  • OpenSWR, an OpenGL rasterisation library
  • Open Image Denoise, which does precisely what it sounds like it should.
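To give a sense of what a ray-tracing kernel like Embree actually accelerates, here's a minimal, unoptimised sketch of the core operation: testing a ray against a triangle. This is not Embree's API — it's the standard Möller–Trumbore intersection test, which libraries like Embree implement in heavily vectorised, hardware-aware form across millions of triangles per frame.

```python
def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
    """Möller–Trumbore ray–triangle intersection.

    Returns the distance t along the ray to the hit point,
    or None if the ray misses the triangle.
    """
    def sub(a, b):
        return [a[i] - b[i] for i in range(3)]

    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    def dot(a, b):
        return sum(a[i] * b[i] for i in range(3))

    e1 = sub(v1, v0)          # triangle edge v0 -> v1
    e2 = sub(v2, v0)          # triangle edge v0 -> v2
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:          # ray is parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)         # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)        # distance along the ray
    return t if t > eps else None

# A ray fired straight at a triangle hits it at distance 1.0;
# a ray offset far to the side misses.
hit = ray_triangle_intersect((0, 0, -1), (0, 0, 1),
                             (-1, -1, 0), (1, -1, 0), (0, 1, 0))
miss = ray_triangle_intersect((5, 5, -1), (0, 0, 1),
                              (-1, -1, 0), (1, -1, 0), (0, 1, 0))
```

A renderer performs this test (or an equivalent) for every ray against every candidate primitive, which is why kernels that organise triangles into acceleration structures and vectorise the maths, as Embree does, matter so much for performance.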

In short, Intel is doing precisely what has enabled Nvidia to become so dominant with the exception that it's making its libraries open source. That does imply that both AMD and Nvidia could port that framework to their own products, should they choose to do so.

Even if the Rendering Framework remains an Intel-only toolkit, at least there's a chance that it will lead to Intel being a solid competitor for Nvidia, which can only be good for the customers.
