
What happens when cameras become "good enough"?

Written by David Shapton | Nov 7, 2014
Sony F55

We may be reaching the point where the tools, gadgets and gizmos that we love to buy are simply good enough. This is the first of two articles (the second will be by RedShark Technical Editor Phil Rhodes) about what happens when our technology gets to the point where we don't actually need any improvements.

I've owned a laptop for twenty-five years now. Not the same one, I hasten to add. In fact, my first laptop probably weighed as much as the table I'm using my current laptop on now - and this table probably has more chance of running Windows than that old thing. But at the time it was great. It didn't really matter that it took several minutes to boot. The screen wasn't even black and white - it was amber on a sort of dull orange. It had a lovely keyboard with keys that were pretty much the same as on an original IBM PC: big, chunky and satisfyingly noisy! Battery life was, let me think... actually, it was zero. You had to power the thing from the mains.

So you can see that there was plenty of room for improvement there.

Ever since, I've upgraded my laptops every two or three years.

New Generations

Something that's pleasing and depressing at the same time is that when a new generation of laptop comes out, it quickly makes the previous generation seem completely obsolete, and this happens on several fronts. Storage has been growing all the time. I've always found that whenever I buy a new laptop, the new one has enough extra storage for me to copy and keep all the data from my previous computer. Two things have changed this recently. The first is that all of my data is now stored on a cloud service, so I don't have to transfer it from one computer to another. The second is that flash memory is more expensive than hard disk storage, so I've had to get used to smaller capacities - but that's not such a big deal with cloud storage.

And of course, operating systems change. At some point you will discover that your old computer won't run the latest OS. Staying with an old operating system is not a sensible option once the updates stop.

But the last computer I bought, nearly four years ago, is doing fine. There isn't a single thing about it that doesn't seem as good as when I bought it. Well, let me put that another way: computers don't actually get worse, it's just that newer ones are continuously getting better. But with my latest computer, a 2011 MacBook Air, there doesn't seem to be any serious reason to upgrade it.

Partly this is because it is made from components that will last a lot longer than their predecessors - LED-backlit screens, solid-state storage and so on - but it's also because it was designed and made just past the point of being good enough.

Now, being "good enough" doesn't exactly sound like a glowing endorsement, does it? But ask yourself (as I have): how "good" does it need to be, once it does everything you want as well as you need it to?

I had to have one

The MacBook Air is a great example of this. When it came out, I knew I had to have one. As a writer, the thing you want most is a computer that you can take with you without hesitation. It has to be small, lightweight and fully functional. No need for video editing, compositing or 3D rendering: just the core activities of browsing, email and, above all, writing.

Well, it was good when I got it and it has stayed good. Absolutely the only sense in which I would consider upgrading it is for a Retina screen.

So maybe that's the catch: even this computer, which I still absolutely love, could be better.

But, actually, that's not the point at all. Because if Retina screens didn't exist, I would still be happy with this one, because, again, it is good enough.

This is a fairly new phenomenon. Normally the pace of technology moves so fast that we're always tempted by the latest thing. So is this vortex of change slowing down at last? No. Not at all. If anything, it's speeding up. (Don't be fooled by the fact that processor clock speeds aren't keeping pace - other developments are making up for this.)

But the phenomenon that I'm talking about here - that your current gadget is good enough - is probably going to become more commonplace. And this has big implications for all technology industries.

They want to make it better

Let's face it: the reason that most businesses are in business is to make money. They primarily do this by selling stuff, and they make you want to buy it by making it better than the stuff you bought last year.

So that poses a difficulty, because if you, the customer, think that what you own is "good enough", then that's a problem. Sales will drop. And that's exactly what's happening in some areas, like smartphones.

You can see how this is happening. Samsung's Galaxy range of phones was popular until the phones became too good. At that point, even after a two-year contract, users felt that although the newer versions might have some improvements, their current phones were still working well. What's more, with operating system updates (like Android's) refreshing even two-year-old phones - and to some extent making them feel new again - the manufacturers are really up against it. They'd have to include a time machine or anti-gravity to really get people's attention.

So I don't need to update my MacBook Air, and I will probably keep my phone off-contract and just buy a plain SIM. I'll save myself money that way and still be happy.


What about professional products?

So, that's the world of consumers. But what about professional products?

Well, I think the same thing is likely to happen, but the question is not as blunt as "how can the manufacturers keep you buying stuff?". It's more nuanced: "what sort of innovation is going to tempt buyers to abandon their already perfectly good cameras, post-production software, computer workstations and so on?"

Well, first of all, I hope they won't force us to upgrade simply by withdrawing support. For some time now, older products have been susceptible to obsolescence through sidelining. In other words, they're left behind not because they stop working, but because things have moved in a different direction and made them irrelevant. This can happen for "good" reasons - no one really complains that they can't use their SD cameras any more when the world has moved on so far. But there's a different, more insidious kind of sidelining, where your device has been left behind because it is no longer supported.

A tripod will always work

How much this matters depends largely on how big a part software plays in the operation of your device. A lens is probably always going to work to some extent, and so is a tripod. But when software is important, it's a potential problem. My Mk 1 iPad still works, but it keeps crashing. This is nothing whatsoever to do with the hardware, which is as good as the day I bought it around four and a half years ago. It's because Apple stopped supporting the Mk 1 with its iOS upgrades some time ago. I'm stuck on iOS 5 and we're currently on 8.

This does matter, because the world keeps moving and what's expected of a device changes over time. With my iPad Mk 1 "fossilized" by an older generation of iOS, all sorts of things stop working. The worst is web pages. A web page these days is likely to be a very complex piece of code, and when we adopt new web standards like HTML5, the operating system and all its associated accessories need to keep pace with the changes. If you are locked into an old OS, then many new web pages (and other things as well) simply won't work and your device is essentially "orphaned". You'll find that a lot of the other things you used to use (for example, cloud services like Dropbox) might stop working too, as they are updated and come to rely on the services of modern operating systems.

Cameras are different to computers, but less and less so. Modern digital cameras have a lens, a sensor, digital electronics and some storage. Some cameras are likely to be supported for a long time, and these are the ones from the "old" manufacturers, such as Sony and Canon. They tend to design cameras around specific chipsets, with very tightly controlled specifications. Other manufacturers have gone down a route where their cameras evolve through software upgrades. In most cases this is very good for users, because their stuff just keeps getting better. Until the point, that is, where the manufacturer decides to stop upgrading.

Does it matter?

But, to come back to the central theme of this article: does it matter? If you're happy with your device, why worry about upgrades?

This brings us back to the business cycle. If nobody upgrades, then nobody makes money, so, potentially, cameras could stop being developed, which would be bad.

Ironically, people might be prepared to pay more for a product that was well specified but guaranteed not to be outdated for a specified time. If it cost more, then the manufacturers could afford to make it better, stronger and more durable. It would be kinder to the Earth as well. And then, after perhaps two or three times the "conventional" upgrade cycle, it would be genuinely worth upgrading again, because things would have moved on so far that it would be an easy decision.

Obviously, this would make a camera an even more expensive purchase, but perhaps there could be a finance arrangement combined with insurance and software updates - so for a fixed number of years your camera wouldn't go out of date, it would get fixed if you broke it, replaced if you lost it, and it would be easy to pay for.

So there it is. I can't believe I've actually said this, but at least part of me wants products to be more expensive, better specified and supported for longer, so that when we do upgrade, it's genuinely worthwhile. Because I don't want to buy a new camera every year.