
Why are modern interfaces not as responsive as we'd expect?


Despite huge advances in technology, why is it that some modern devices can still feel slower than a ZX Spectrum from 1982?

Press a key on a ZX Spectrum released in 1982, and a character is very likely to appear on the screen no more than one video frame - that is, one twenty-fifth of a second, or 40ms, later. That's fast enough to feel more or less instantaneous, but the same often can't be said for modern devices, which can sometimes feel - well - laggy.

That's not what we'd expect given what's happened to technology in that time. At the time of writing, the cheapest Google smartphone, a reasonable mid-market option, was the Pixel 4A. It boasts a Snapdragon 730G processor with eight 64-bit cores, four clocked at 2.2GHz and four at 1.8GHz. We could try to calculate just how much clear water there is between that and the Spectrum's single 8-bit 3.5MHz Z80, but I think we can probably agree that there's a lot. Do smartphones like the Pixel respond to screen taps within 40ms?

Sometimes.

Then there's sheer size. Viewing a Yahoo Mail account involves about 7.4MB of "sources," as the Chrome browser puts it. Establishing the amount of data actually involved is complicated, but that's likely a reasonable maximum; most requests will be much smaller, and carefully so, because bandwidth costs Yahoo money. Compared to the size of the same site at launch in 1997, it's vast, and while it has a lot more features now, it does more or less the same things as the Yahoo Mail app for iOS 13 or later, which, Apple informs us, weighs in at a sturdy 245MB before it's started storing emails. Why a phone that already has a TCP/IP stack and a huge variety of built-in display subsystems needs 245MB more resources in order to read emails based on early-1970s technology is a question worth asking.

The Pixel 4A. Image: Shutterstock.

Why does it happen?

There are a few reasons for all this.

On that old ZX Spectrum, pressing a key pretty directly causes a (very) small piece of built-in code to be executed, which copies the character's bitmap into the screen memory; when the display hardware next generates a frame, some hardwired silicon reads that memory out as the monitor's electron beam sweeps across the display. There are no layers of software between reading the keyboard and the code that handles the keypress, and no stack of intermediate buffers to be composed and copied before the result reaches the screen.
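To make that concrete, here is roughly what that path looks like, sketched in C rather than the few dozen bytes of Z80 assembly the real ROM uses. The function and glyph names are invented for illustration, and an ordinary array stands in for the display file, but the display file genuinely does sit at address 0x4000 and genuinely is read straight out by the hardware every frame:

```c
#include <stdint.h>
#include <stdio.h>

/* Stand-in for the Spectrum's display file, which really lives at 0x4000
   and is read directly by the video hardware to build each frame. */
static uint8_t display_file[192 * 32];   /* 256x192 pixels, 8 per byte */

/* One 8x8 glyph as the ROM's character set stores it; only 'A' shown. */
static const uint8_t glyph_A[8] = {0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00};

/* Conceptually called straight from the keyboard-reading routine: copy the
   glyph's eight rows into screen memory and the job is done. No event queue,
   no toolkit, no compositor -- the next frame the hardware scans out will
   show the character. (The real screen layout interleaves its rows; a
   linear layout is used here to keep the sketch short.) */
static void put_char_at(int col, int row)
{
    for (int line = 0; line < 8; line++)
        display_file[(row * 8 + line) * 32 + col] = glyph_A[line];
}

int main(void)
{
    put_char_at(0, 0);                      /* "press" the key            */
    printf("%d\n", display_file[3 * 32]);   /* 126, the glyph's crossbar  */
    return 0;
}
```

That is the entire journey from keypress to pixels: a handful of memory writes, after which the hardware does the rest.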

Modern computers, including phones, have multiple layers of code, most of which would exceed the memory limitations of a ZX Spectrum by hundreds of times, between the hardware that reads the touchscreen and the hardware which drives the display. Worse, some of that code may be written in managed languages such as Java (on Android) or C# (on Windows), which run as bytecode on a virtual machine, or in Objective-C (on iOS), which compiles to native code but routes every method call through a dynamic runtime. These languages trade computer time for programmer time – they make code easier to write, but slower to run, a Faustian bargain.
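A toy model makes the contrast plain. The layer names and millisecond costs below are invented for illustration - no real operating system is being quoted - but the shape of the journey, from driver to input server to toolkit to application to compositor, is broadly how a modern touch event travels:

```c
#include <stdio.h>

typedef struct { int x, y; double t_ms; } touch_event;

/* Each layer stands for real work: copying, queuing, waking a process.
   The costs are illustrative; real figures vary enormously by device. */
static touch_event kernel_driver(touch_event e)    { e.t_ms += 1;  return e; }
static touch_event input_server(touch_event e)     { e.t_ms += 2;  return e; }
static touch_event toolkit_dispatch(touch_event e) { e.t_ms += 3;  return e; }
static touch_event app_handler(touch_event e)      { e.t_ms += 5;  return e; }
static touch_event compositor_vsync(touch_event e) { e.t_ms += 16; return e; } /* wait for the next 60Hz frame */

int main(void)
{
    touch_event e = { 100, 200, 0.0 };
    e = compositor_vsync(app_handler(toolkit_dispatch(input_server(kernel_driver(e)))));
    printf("touch visible on screen after ~%.0f ms\n", e.t_ms);  /* ~27ms - on a good day */
    return 0;
}
```

Each hop is individually cheap; it's the accumulation, and above all the wait for the compositor's next vertical sync, that eats the latency budget.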

Code written that way tends to be bigger as well as slower, but that doesn't account for the increases in program size we've seen over the last few decades. Some of that is due to the tendency of modern software to use lots of big colourful pictures to form parts of its user interface, a regrettable development that emerged around the time of Windows XP. Another part of it is the tendency of software engineers to want to reuse things other people have made. That's fine; that's what modular programming and shared libraries are for, but it's easy to end up including huge amounts of code and resources as part of frameworks and toolkits which may never be used in full.
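The pattern is easy to demonstrate. In the self-contained C sketch below the "framework" is entirely made up - a four-megabyte block of dummy assets and one convenience function - but it shows the shape of the problem: the application touches a tiny fraction of what it links against, and the rest ships anyway unless the toolchain is explicitly asked to strip it:

```c
#include <stdio.h>
#include <string.h>

/* A made-up "framework": megabytes of bundled assets (think icons, fonts,
   translations) plus dozens of features, of which we use exactly one. */
static const char framework_assets[4 * 1024 * 1024] = { 'x' };

/* ...the one convenience function we actually wanted. */
static size_t framework_text_length(const char *s) { return strlen(s); }

int main(void)
{
    /* We call one function and read one byte, but because linkers work at a
       coarse granularity, all four megabytes land in the binary regardless,
       short of special dead-code-stripping options. */
    printf("characters counted by the framework: %zu\n", framework_text_length("hi"));
    printf("first asset byte: %c; assets shipped anyway: %zu bytes\n",
           framework_assets[0], sizeof framework_assets);
    return 0;
}
```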

Software bloat

Least flatteringly, it's hard to avoid the reality that all of this is partly due to sheer carelessness. Things work because we have so much performance to waste, but it's painful to realise that at least some of the advances in storage capacity, processor speed and network bandwidth have quite simply been absorbed by the lumbering behemoth of modern software.

This might come off as the hectoring of a real-ale greybeard, and to an extent, yes, some of the worst examples are the most recent things. Google Docs in Chrome is a large piece of JavaScript, involving several hefty JavaScript libraries, running on top of a web browser's rendering engine, running on top of several layers of graphics APIs, running on a computer, and it can feel slower than Digita Wordworth did on an early-90s Amiga, despite Wordworth offering many of the same features. Some - many - of these layers are a necessary concomitant of modern operating systems, but it's hard not to get the impression that something has been lost in the rush for features.

Fixing all this is technologically possible, although a complete solution would require a ground-up rewrite of most of modern computing. Still, much could be achieved by an even slightly more rigorous attitude to efficiency. It doesn't seem too much to ask that user interfaces feel at least as responsive as they did thirty or forty years ago.
