Computer technology has reached a point where the constant need to upgrade is no longer there. That's good for the environment, and it's good for our wallets. Why is this?
I've just had to dispose of a pile of laptop computers. They were all my partner's, and they were ones that I'd advised her to buy. I'm pretty sure my two main criteria were that they should be cheap but not rubbish. Not an overly technical stipulation, then.
But why were there three of them destined for oblivion in the space of about ten years (dating from roughly 2005 to 2014)?
It's mainly because, actually, they were a bit rubbish. Even though they might have been able to boot up and run a web browser - and even do more demanding stuff - they weren't very well built. But there's a reason for that.
It's because they weren't expected to last very long, because of two fundamental weaknesses, and one characteristic of computer technology at the time.
First, they used spinning hard disks. If these were "enterprise quality", you might expect them to last for more than three years. But a tiny drive in a cheap consumer laptop? Well, statistically, a few might last longer than that. But would you want to risk it?
That's hardly surprising. They were complex electro-mechanical devices that didn't like being moved. In a portable computer. Now that we have SSDs for main storage in laptops, the situation could hardly be more different. I'll come back to that in a minute.
The second thing was the screens. Before LED backlights, computer screens had to rely on fluorescent lighting. This worked surprisingly well, until it didn't. LEDs have massively boosted the lifespan of computer screens.
Even small laptops such as the MacBook Air will now give good performance for most tasks, enabling them to last much longer as useful devices - Image: Apple
Less pressure to upgrade
Meanwhile, the mad rush for faster CPU clock speeds has run out of steam. Actually, it did about ten years ago. Which means that, even though processors continue to improve because of multiple cores and cleverer architectures, the almost daily increases in clock speeds have petered out. There's less pressure to upgrade because of this than there used to be.
I only thought of this when, the other day, I realised that I'm still using an eight-year-old MacBook Air as part of my current setup. Admittedly it had an issue with the battery, which Apple fixed as part of its obligations under EU consumer law, but with that sorted out, it's almost as if it's a new computer.
None of which is to say that modern computers aren't better. They obviously are, but the thing is that if these older computers have LED-backlit screens and solid-state storage (not infallible either, but less likely to be troublesome because it won't, you know, start spinning), then they will have longer useful lives.
Part of this is because almost any modern computer can do almost anything that's required of it, as long as it's not something that needs cutting-edge technology. If you just want to write a bit, do your emails, watch a few videos, and definitely don't want to play the latest games, then there's very little reason why an eight-year-old MacBook Air wouldn't do the job. My Air is even good for some mild HD video editing (it will support a 4K screen if you plug one in, although it will overheat at that resolution within a few minutes).
There's another thing too: being able to connect to the cloud.
I use Dropbox, as well as a few other cloud services. In fact, I don't think there's anything that I don't have backed up in the cloud now. So I feel better about using older computers. As a writer, it's quite possible that what I'm writing could be worth more than the computer I'm writing it on. There's no way I want to take that risk - so everything I do is automatically saved in at least one, and often two, cloud services.
And if something does go wrong, I know that I can get up and running with a new computer almost instantly, without any significant loss.
All of which adds up to a new way of using computers.
Computing, including personal computing, is never going to stand still. But I do think this is a good trend. I just wish more people would behave like this with their smartphones, too, which, after all, are the new personal computing platform.