
The rise and rise of the almighty GPU


The current state of the art: an Nvidia Pascal architecture GPU.

In our continuing series of articles about the GPU, we delve into the history of the GPU and its evolution into a more general-purpose computing platform.

We've talked about the way in which modern GPUs (that is, the chip at the heart of a graphics card) provide processing power in a very different way to traditional CPUs. One of the most interesting aspects of this is that we now use devices descended from hardware designed solely for rendering visuals, and we use them for tasks that aren't necessarily anything to do with pictures at all. Okay, so video games still use GPUs to render graphics, and many of the applications in film and TV are picture-related, but the way in which the hardware is used is often rather different. The expansion of specifically graphics-related hardware into general-purpose computing wasn't really intended, at least not for most of the history of the technologies concerned.

Parallel beginnings

The very earliest hardware that was designed to perform parallel operations, in which the same sums are done on an array of input numbers, probably dates from the 1970s. The famous Cray supercomputers used parallel computing as a way to get more performance out of comparatively slow hardware, simply by using more of that hardware all at once. This technology was overtaken by the enormous R&D efforts put into general-purpose CPUs, though, and faded from prominence in the early 1990s. Almost simultaneously, the precursors of graphics processing units were being developed for video games, a market attracting much more R&D than any of the individual industries in which GPUs are now used for general-purpose computing.

The famous Cray line of supercomputers, which persists to this day, used parallel processing.

Naturally, video game systems since the 1970s have had video outputs and they've required hardware to drive those outputs, but the recognisable forerunner of the modern GPU probably didn't appear until the late 80s or early 90s, with devices from Taito and Namco intended for arcade games. Perhaps the best-known device with hardware specifically intended to render 3D graphics was the Sega Model 1 arcade system board of 1992, which was behind the well-known Virtua series of games: Virtua Racing, Virtua Fighter and, because the board was so expensive, only four others. The board included five Fujitsu TGP MB86233 chips, which were designed to rasterise polygons (that is, convert them from 3D coordinates to a 2D image). Since any useful scene in a video game includes lots of polygons and the game must display animation over lots of frames, there are lots and lots of polygons to render, over and over.

id Software's Doom is famous, but didn't use GPU acceleration (and is only debatably true 3D in any case).

In the following year (1993), id Software released Doom, which is one of very few things in technology that can be unblinkingly described as revolutionary. To be precise, Doom wasn't quite true 3D in the way we now think of 3D graphics, but it did an exceptional job of simulating the same result. What it didn't do, however, was run on any sort of specialised graphics hardware. It ran on the CPU, using some very clever tricks to maintain any sort of playable performance on the processors of the day. The first consumer product to include 3D graphics acceleration was comparatively obscure: the Super FX chip in the Nintendo game Star Fox. The extra hardware allowed the Super Nintendo to produce 3D graphics that its extremely modest Ricoh 5A22 processor, clocked at a princely 3.68MHz, could never have managed alone. While Matrox released the Impression Plus card the following year, it wasn't until the PlayStation and the 3dfx Voodoo series that anything recognisably like modern GPUs became available at consumer prices.

General-purpose computing

Moving from computer games to general-purpose computing wasn't perhaps the most obvious evolution for GPUs. The ability to do extremely high-performance graphics operations was recognised early on, with Matrox releasing the RT2000, which could do advanced video tricks. That still relied on using the graphics card to do graphics work, however, and it wasn't until people started getting creative that the wider potential began to be recognised. Early tricks involved treating general-purpose data as an image and uploading it to the GPU as a texture map, whereupon the GPU could be used to do some types of mathematics on it. This was quickly shown to work so well that more general-purpose approaches were developed, and here we are.
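For a rough sense of what that data-parallel model looks like once it's given a proper programming interface, here's a minimal sketch in CUDA, the kind of general-purpose approach that eventually replaced the texture-map workaround. The kernel name, the gain and lift parameters and the array size are all invented for illustration; the point is simply that the same arithmetic is applied to every element of an array at once.

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Illustrative element-wise kernel: every thread applies the same
// arithmetic to one element of the array, which is the essence of the
// data-parallel model the early texture-map trick was approximating.
__global__ void scale_and_offset(const float* in, float* out,
                                 float gain, float lift, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * gain + lift;   // the same sum, done across the whole array
}

int main()
{
    const int n = 1 << 20;                          // around a million values
    std::vector<float> host_in(n, 0.5f), host_out(n);

    float *dev_in = nullptr, *dev_out = nullptr;
    cudaMalloc(&dev_in,  n * sizeof(float));
    cudaMalloc(&dev_out, n * sizeof(float));
    cudaMemcpy(dev_in, host_in.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every element.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    scale_and_offset<<<blocks, threads>>>(dev_in, dev_out, 1.2f, 0.1f, n);

    cudaMemcpy(host_out.data(), dev_out, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("first result: %f\n", host_out[0]);

    cudaFree(dev_in);
    cudaFree(dev_out);
    return 0;
}
```

The early texture-map trick achieved much the same thing, but had to disguise the array as an image and the arithmetic as pixel-shading work.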

Resolve on a home computer couldn't exist without games.

In the intervening time, general-purpose computing on the GPU has arguably fed back into the design of the GPU itself, which has become less specifically oriented towards graphics processing and more general-purpose, although most GPUs still include hardware that leans heavily towards the sort of operations required for 3D rendering and image mapping. The capability of GPUs has become so significant that some postproduction software has been able to leverage it to produce final renders, returning to the idea of using graphics hardware more directly to render graphics for film and TV work.

If there's a conclusion to be drawn from this still-evolving story, it's that a wide range of previously extremely difficult things have been made trivial. The canonical example in film and TV is colour grading, which used to be the domain of extremely expensive custom hardware. That hardware has now been made utterly obsolete by pocket-money commodity GPUs, to the point where even quite modest modern graphics cards, far from the top of the price range, will do all most people will need. On beyond-HD material, at high frame rates or when using particularly demanding techniques such as temporal noise reduction, it's quite possible to run a modern GPU out of performance, but in general the performance available is absolutely staggering. Fields such as medicine, where GPUs are used to reconstruct 3D imaging of people, have benefited to a similarly vast degree in terms of hardware cost.

More performance may be a solution looking for a problem, especially as the sheer effort required to develop ever higher-fidelity 3D environments for games may serve to limit how much power they actually need.

But then, they said that about the laser.

Tags: Technology
