
What is the state of the art in PC graphics?


Nvidia's GTX 1080 Ti: a peach but only a minor boost over the non-Ti version

If you look at the world of PC graphics from the perspective of the consumer, you are forced to conclude that last year was a bit of a let-down. Let's be clear: the current crop of graphics cards is the best we have ever seen, and the GeForce GTX 1080 in particular is a peach of a GPU. However, that is only part of the story.

It is worth pointing out that during this look back at the events of 2017 we are sticking with PC graphics. In the world of Apple Mac, there have been significant moves with iMac, iMac Pro and the latest AMD graphics. However, we are putting Apple to one side. Anyone who bought a new iMac in 2017 and uses Final Cut Pro has a good reason to feel smug and self-satisfied, which only adds to our pain.

Over in the Land of PCs, where we use Adobe Premiere or DaVinci Resolve, it is a different story, as we are obliged to buy a graphics card to take on some of the workload from the CPU. This leads on to the question of which graphics chip to buy, followed by the make and model of the graphics card. Gamers get hung up on the precise clock speeds of the GPU and its memory so they can fine-tune the image quality settings of their games. By contrast, video editing hurls such a heavy workload at the PC that the CPU and GPU simply need to crunch through the work as fast as possible. Once you have a decent graphics card (say, a GTX 1080) in your video editing machine, you will only see a minor benefit from installing a better graphics card (say, a GTX 1080 Ti), whereas a move from a quad-core CPU to eight cores is a game changer.

The fundamental problem for the consumer is that Nvidia has recently overshadowed AMD in the graphics market, stymying competition to the extent that product development has slowed and prices remain uncomfortably high. If you cast your mind back three years to late 2014, you may recall the launch of the GTX 980, which was a damn fine graphics chip. Nvidia rolled out the 10-series in 2016, and it's arguable that we haven't seen any significant progress since then.

The GTX 1080 is a die shrink of the GTX 980 (28nm down to 16nm), which means it crams in many more shaders and is able to run at much higher clock speeds. There are other changes, such as the move from GDDR5 memory to GDDR5X and the revised GPU compressing data more efficiently to allow greater throughput. Balanced against that, with the 10-series Nvidia ended support for 3-way and 4-way SLI, and it would much prefer that you use a single GPU and forget about SLI altogether.

The fact is that the GTX 1080 is better than the GTX 980. However, it is clearly derivative rather than a new architecture. It doesn't much matter whether a new family of GPUs is based on a previous family. The point here is that the GTX 1080 was only intended as a stop-gap by Nvidia, a way to test the new fabrication process before they moved on from the Pascal architecture to Volta. To date, Volta hasn't appeared for the desktop (although you can buy a Titan V for £2700) and Nvidia has instead made the GTX 1080 stretch throughout 2017. This is entirely due to their dominance over AMD and the (in my view) underwhelming Polaris RX 580 GPU, which struggles to compete with the GTX 1060.

The GTX 1070 and GTX 1080 simply have little competition and this has allowed Nvidia to maintain high prices. A basic GTX 1080 will cost you £500 while the GTX 1080 Ti will set you back £700.

Choices, choices...

There is at least choice from Nvidia. 

One example is the way it introduced the Founders Edition cards, which is another way of saying 'Reference Design', while charging a premium at the same time. Ordinarily, a basic reference card is cheaper than the more advanced after-market cards.

Then we have the story of the Titan, where Nvidia has launched a series of five GPUs with similar names and hugely different specifications. We started with the Titan and then went on to the Titan X, Titan X Pascal, Titan Xp and now the Titan V.

Another is the way Nvidia introduced the GTX 1070 Ti, which slipped into a gap in the product stack just below the GTX 1080, priced at £450.

In essence, Nvidia has spent a year selling the notion that the GTX 1080 is a high-end graphics card that deserves a price tag of £500, when I think it is actually a mid-range GPU that should be priced around £350. The consequence is that every other graphics card in their product stack is priced accordingly, and while I admire the GTX 1060, I feel it is far too expensive at £300.

And just so we are clear: in the past year I have bought a GTX 1070, two GTX 1080s and one GTX 1080 Ti with my own money.

Over at AMD

The explanation for this situation is that AMD hasn't competed well with Nvidia recently, which is a pity, because AMD is the only company in a position to provide that competition.

I don't think the RX 4xx Polaris was ever going to compete at the high end, but it looked competent in the mid-range. At the high end, AMD was not helped by its earlier move from GDDR5 memory to a new technology called HBM (High Bandwidth Memory), which limited the Fiji-based Fury cards to 4GB of memory and also raised the cost of materials.

In January 2017, AMD started to tease us about its new Vega graphics design, which needed to compete with the GTX 1080 if it was to have a chance of success. We knew full well that Nvidia had its next-generation Volta lined up; if Vega looked good, Nvidia would launch Volta as a riposte. In the event, Vega seems to have failed to match the GTX 1080. A bigger problem was that clock speeds and power draw had been cranked up in an attempt to match Nvidia, with the result that Vega runs slower, hotter and more power-hungry than Nvidia's Pascal. And prices were very high: £500 for Vega 56 and £600 for Vega 64.

At the time of writing, you still cannot buy the after-market Vega cards from Asus that were supposedly launched back in September.

The current rumour is that Nvidia has cancelled Volta and has moved on to Ampere, which is supposed to use 12nm FinFET technology. When will Nvidia launch Ampere? Who knows. They are under no pressure.

The best we can currently hope for is that AMD will roll out a die shrink of Vega that reins in the unusually high power draw and delivers better performance.

So, it seems like this recent round goes to Nvidia. We'll have to see what the rest of 2018 has to bring as it unfolds.
