
Cameras are so good these days. So how do you choose the right one?


How do you choose a camera? In this article, we use examples from everyday consumer life - smartphones, cars, computers and so on - to put the latest camera developments into perspective. And we ask: are we reaching the point where you don't always need the latest and greatest?

A little while ago in RedShark I wrote an article called "Good Enough" in which I said that in some ways modern cameras were becoming so good at capturing images that we should be looking beyond mere resolution to choose between them. The point wasn't that all cameras are the same: quite the reverse in fact. There's more diversity than ever; it's just that if you use the currently predominant metric - pixels - to choose between them, you can't... choose between them.

I've been thinking a bit more about this. It goes deeper. And that's because everywhere we look, the devices that we buy are exceeding the needs of their users.

I bought a mobile phone the other day. I needed a spare in case my already flaky smartphone fell over completely when I was in Las Vegas for the NAB show. The fact that I'd dropped it probably didn't help but there are enough things wrong with it to make me think I shouldn't rely on it.

My phone is a Samsung Galaxy Note II: a really great phone, but it's expensive to buy. Far more, in fact, than a basic or even a mid-range laptop. So I didn't want my spare to cost that much.

At the same time, I didn't want my spare to do significantly less than my main phone. I'm pretty much a power user of smartphones. I use them to make calls infrequently (that's what Skype is for!) but I do want my email updated instantly, and I also dabble in social media (the RedShark posts you see on LinkedIn, Facebook and Twitter sometimes come from my smartphone).

You don't have to spend too much

Luckily, I didn't have to spend much. There's a reasonably new phone called the Motorola G. It's a good candidate for me. It's got very nearly pure Android on it, and it looks good. The trouble is that all the reviews say that it's not cutting edge and is a bit underpowered.

You might think that would have put me off. I don't normally buy things that are behind the curve - I'm too much of a geek for that! But the point is that the Motorola is not behind *my* curve, because my other phone is already a year and a half old: prehistoric in smartphone terms.

So, to me, the Motorola G, which cost less than a quarter of the price of the Samsung, seems pretty snappy. Fast, even. And it does everything I need it to do.

It's so good, in fact, that I'm probably not going to renew my expensive contract for the Note II when it comes up. What's the point? The cheap and cheerful - but completely competent - Motorola does the job. It's good enough. Easily.

I'm sorry to have gone on at such length about smartphones. But the same thing is happening all over the place. I drive a car that's not made by a "prestige" make like BMW, Audi or Mercedes. It's a Ford. And even though I've driven (and owned) some of the other makes, I actually prefer the Ford. It may not have the same interior quality, but in terms of handling, reliability and sheer competence, it's just as good and very significantly cheaper. When the time comes to replace it, I'll get another one if I can.

It's tempting to say that a wide range of consumer and professional products have reached the point where they don't need to get any better. But there's more to it than that, because our expectations and requirements change almost as rapidly as the products.

The products change because we're in an era of exponential progress where the current generation of tools designs the next generation of more powerful tools, and where we can pack more and more onto the same size chip. At the same time we combine interconnectable technologies to make new ones almost out of thin air (think of Google Maps "Mash Ups").

One result of this heady cocktail - a truly virtuous circle - is that product cycles have shortened. When my parents were young, you'd buy a radio from the village shop, and when it broke down ten years later you'd get it repaired or buy a new one, and it would probably be the same model. Now, in consumer electronics, product cycles are somewhere between six months and six weeks. It may take a given manufacturer a year to update their primary model, but in that time several competitors will probably have leapfrogged its performance and set a high barrier for when the original model is replaced.

Consumers get caught up in the frenzy surrounding product launches, and their expectations are raised, but these expectations don't necessarily correspond to what the consumer actually needs. You may think you need your next tablet to have the power of a desktop computer (and with Apple's latest "A" processor it probably almost does) but if all you do is browse the web and read your eBooks on it, you really don't need all that power.

Manufacturers recognise this and try to "improve" their products on all fronts: you might not need the speed, but perhaps you fancy a higher resolution screen, or want your tablet to be a bit thinner.

But the fact remains that the previous generation will probably do the job for you. If you can live with the shame of not having the very latest model, then you might as well buy the previous one.

There are exceptions to this rule. You probably shouldn't be buying Standard Definition cameras right now, unless you're retiring from the trade soon. Nor should you buy products that are not going to be supported. It's all too easy to buy cheap-looking I/O devices, for example, only to find that they don't have drivers for the latest operating systems. If you do make this mistake, essentially, you're buying a paperweight.

4K is very demanding

So what about cameras and other production equipment? Well, as far as computers are concerned, video - especially 4K - is very demanding. You should probably buy as powerful a computer as you can. If you're a professional, don't even think of saving money on graphics cards and motherboards: time is money, and you'll simply be wasting it if you don't buy the latest and best - unless you only do basic video editing and the setup you have in mind won't be taxed. You *can* buy cheaper gaming cards, and they can be very powerful, but they may not be completely supported by the big professional applications, and you may find it difficult to get help if you need it.

Second-hand software may do the job if you don't need the latest features, but if you're like me you'll find yourself pining for something that all your colleagues are using and which you know will make your life easier. In that case I would go for a subscription, which has a low capital outlay and means you always have up-to-date software. (I don't side with those who think subscriptions are the worst thing ever invented. I would probably never buy the full Adobe Creative Suite outright, but I possibly would pay the subscription. I should note that, as the Editor of RedShark, Adobe *gives* me a subscription.)

But what about cameras? This is where it gets quite complicated and I don't for one minute claim to have all the answers.

With cameras, you need to look at the current paradigm for the field you're working in. What do I mean by "paradigm"? Actually, I'm not even sure it's the right word, but it's close enough.

The current paradigm is what people require and expect you to use in your field of work, and which is widely recognised as giving acceptable results.

I use the word "acceptable" advisedly, because it sounds like it should mean "only just acceptable". That's not how I would use it. For example, if you're making feature films that are candidates for Oscar nomination, then what's acceptable is actually a very high standard. Nothing less will do. Perhaps a better way is to talk in terms of what's accepted by your peers. If any of them say "it's a great film, but some aspects of the visuals had me concerned", then that's not acceptable.

If you're making films that people are going to watch on their smart watches (i.e. on a tiny screen), then you can probably get away with less - not forgetting that the better the quality you start with, the better even a highly compressed derivative of your content will look.

What if you're making wedding videos? These still have to look good to your audience. But what is the audience expecting? They're probably going to want a certain "look" that's easily obtained with the sort of shallow depth of field you can get from a good DSLR.

The point is that there are now very good examples of cameras for almost all fields and specialities. We're even ahead of the consumer curve with 4K, just as HD content creation led HD content consumption ten years ago.

So why do we so eagerly await new models? I think there are several reasons.

First, it's exciting. It just is. It's always fun when a manufacturer surprises us with something new that changes things. Somewhat counter-intuitively, this is going to happen more often. It's counter-intuitive because you might think that just about everything has been invented by now. But it doesn't work like that: each new invention pushes the envelope even further ahead, and means that the next innovation is likely to follow sooner rather than later.

New 4K cameras are everywhere

Just look at last week's announcements at NAB: new 4K cameras everywhere - and some of them from manufacturers who only a few years ago were better known for making I/O boards than electro-optical devices.

Next, we hope that new models are going to be cheaper. That feature or capability we've lusted after, and that's only been available on cameras outside our budget, is almost inevitably going to come down into our price range (unless it's an exotic lens that takes someone six months to build by hand!).

And finally, perhaps, we look to new camera releases to inform us (or reassure us) about the direction the industry is taking. After all, none of us wants to buy a camera that's going to take us down a format dead-end.

With more choices than ever, buying is riskier too. We're more likely to buy a product that doesn't do what we need it to do, or one that's going to be out of date, obsolete or just plain irrelevant well before it has paid for itself - or before we have paid for it. Which makes it even more important that we buy cameras that will do exactly the job we need them to do.

That takes a lot of research, and the research needs to be up to date. Look at the trends: the more you understand them, the better your purchasing choices will be. You can buy a camera for now, and be almost certain that it will do the job. Or you can buy one more in line with the trends you've observed - but the risk that you'll be wrong, and that the trend will take a left turn, will be higher.

And if you make the wrong choice, there are two things you can console yourself with. First, you probably didn't spend as much on your mistake as you would have done a few years ago. And secondly, there aren't many really bad products out there - and if you do get it wrong, eBay is your friend.

But if you get it right, you've just bought a camera with more resolution, a better picture and a lower price than anyone would have thought possible just a couple of years ago.


