
Technology is absolutely not slowing down

Replay: Is technology slowing down, as some reports would have you believe? No. Absolutely not.

Image: Shutterstock.

I read an article recently that said that the current wave of technological progress is slowing down: even to the extent that there won't be enough computing power for the next generation of AI. In my view this is simply not correct, and the reverse is true.

As evidence for the alleged slowdown, the author said that he was still using a six-year-old MacBook. He made the point that he didn't feel compelled to upgrade and that (to slightly oversimplify his thesis) this proved the end of Moore's law, which is the empirical observation - not actually a law - that we tend to be able to fit double the amount of electronics into a given space every eighteen months to two years, whichever version you prefer.

The effect of this has been that for the last three decades or so we've been able to rely on computers and electronic devices getting smaller and faster, and cheaper for a given level of performance.

And it is almost certainly true that Moore's law is starting to struggle to keep up with its reputation. Some on-chip connectors are now only 18 atoms wide. Beyond this point, weird and unpredictable stuff happens because of quantum physics. More to the point, it's hard to dissipate all the heat that's generated by having so much going on in such a small area.


Your older computer might well do what you need it to, but that doesn't mean technology hasn't moved on immensely.

A laptop is not a good unit of measurement

The apparent reality is that desktop and laptop computers haven't been getting faster at anything like the old rate for over a decade, but that's not where we should be looking if we want to find exponential progress alive and kicking.

We'll come back to that in a minute, but first I want to talk about a phenomenon that's making a number of commentators think the exponential era is over. It's simply that computers, for example, have been easily fast enough for most everyday work for some time now, so we don't need them to be any faster. That, wrongly, leads users to conclude that because there's no need for faster devices, they don't exist.

Do you need a faster computer to write your emails? Almost certainly not. What about writing and recording music? Nope. I routinely use an eight-year-old MacBook Air for music composition. It's not the fastest machine, but the point is that audio is relatively undemanding even for a fairly old computer. We passed the stage when it was difficult a long time ago - maybe twenty years.

I'm not being unrealistic here. I don't work with huge musical projects that call for dozens of plug-ins, software synths and a massive number of audio tracks. Would I like a faster computer for my music work? Yes, of course. Do I need one right now? No.

Real exponential progress

What about video editing? HD-level video probably works on most modern computers, and even some that are three or four years old. What about 4K? Nope. You need something modern for that. This is the point at which you start to need progress. 8K? Don't think about it without a system that can practically levitate. But such powerful systems do exist. I currently have a Gigabyte AERO laptop with a 4K OLED screen and a ray-tracing NVIDIA card. There's enough there to suggest spectacular progress. Real-time ray tracing on a laptop? Just two years ago that would have been impossible; now here it is.


The Gigabyte AERO laptop would have been impossible only two years ago.

Do you see a pattern there? 4K is 4x HD. 8K is 4x 4K and 16x HD (and around 85x SD!). On top of this we have higher frame rates and greater bit depths. It's quite possible that computers today can handle 100x or more the data rates of 20 years ago, when we first thought computers were slowing down.
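If you want to check that arithmetic yourself, here's a quick Python sketch. The resolutions are the usual assumptions (PAL SD at 720x576, HD at 1920x1080, UHD 4K and 8K); with NTSC SD at 720x480 the 8K ratio rises from 80x to 96x, which brackets the rough 85x figure above.

```python
# Back-of-the-envelope pixel counts for common video resolutions.
# Assumed standards: SD (PAL, 720x576), HD (1920x1080), UHD 4K and 8K.
resolutions = {
    "SD": (720, 576),
    "HD": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

# Pixels per frame for each standard.
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:>10,} pixels "
          f"({count / pixels['HD']:.1f}x HD, {count / pixels['SD']:.0f}x SD)")
```

And that's per frame, before you multiply again for higher frame rates and greater bit depths.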

But here's the thing. This increase is not simply due to faster processors. Some of it is, and some of it is down to better processor architectures. But it's also due to a myriad of other factors that influence the overall performance of a computer system: memory bandwidth, GPU speed and - again - architecture. Multiply all these and more together and you arrive at cutting-edge computers that are astonishingly fast. We've even started seeing Apple adopt FPGAs as an option in the new Mac Pro. These programmable logic chips essentially run software at hardware speeds, for another significant speed boost.
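To see why "multiply" matters, here's a trivial Python sketch. The factors are entirely hypothetical - the point is simply that modest, independent gains across a system compound multiplicatively rather than adding up.

```python
# Illustrative only: these speed-up factors are hypothetical, not measurements.
# The point: independent gains in different parts of a system multiply.
factors = {
    "faster cores": 2.0,
    "more cores": 4.0,
    "memory bandwidth": 2.0,
    "GPU offload": 5.0,
}

overall = 1.0
for name, factor in factors.items():
    overall *= factor  # compound each gain into the running total
    print(f"after {name:<17} (x{factor:g}) -> {overall:g}x overall")
```

Four modest factors and you're already at 80x overall, which is why "the processor isn't much faster" tells you very little about the system as a whole.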

Speaking of software, this is improving too. Even if hardware were "frozen" at its current level for ten years, we'd still see progress as software techniques and frameworks get better, and because connectivity has reached the stage where distributed processing (via the cloud) is available to virtually everyone. While this may not actually speed software up per processor, it does mean that we can do more with it. Look at applications like Frame.io: a cloud-based video sharing and review system with incredible functionality that doesn't rely on hardware platforms getting faster.
