
Too Much Technology?


We love technology at RedShark, and it's changing at an ever-increasing rate. But not everyone, including our own Phil Rhodes, agrees 100% with us!

I am no stranger to the concept of vorsprung durch technik, even though I have never had anything to do with Audi. In the perhaps 250 years since the beginning of the industrial revolution, technology has brought enormous benefits to humans. Modern western society relies on it heavily, and it would almost certainly be impossible for Earth to support 7 billion people, in the various styles to which they have become accustomed, without it. As such I generally react positively to technological development, especially as, since the 1950s, many of the low-hanging fruit of science, technology, engineering and medicine have been picked and the rate of genuinely new achievements in those fields is tailing off alarmingly. The rise of antibiotic-resistant bacteria, and the failure of alternatives to appear, is one example of this. The fact that the next generation of spaceflight is likely to use the same 1940s propulsion technology as the previous one is another. Fixes to these and many other problems are sorely needed.

Downsides to Moore's Law

But great as Moore's Law is, and much as I appreciate the increased flexibility of being able to use After Effects on processors an order of magnitude faster than they were less than ten years ago, there are, as ever, a few downsides. Devices which would look like alien artefacts to someone from only a few decades ago are now cheap and disposable. Advances in manufacturing technology – itself enabled, arguably, by previous developments – have made highly complex objects (such as microelectronics) trivial to replicate, and advances in design technology have made the design work easier too. One problem is that the techniques intrinsic to this trivialisation of the very complex rely heavily on the mass market, so consumers, designers and manufacturers end up mutually trapped in a laissez-faire circle of disposability, developing technologies not because they're required, but because the development process itself demands that the cycle never stops. The concept of engineering development which feeds other development is not new, but the tightening of that loop's timescale, well beyond the ability of society to explore or react to new technology, is.

The Poster Child for the Problem

The poster child for this problem is the cellphone, which has long since developed beyond the needs of simple communications. Even if we accept that it's nice for everyone to carry around a pocket-sized wireless terminal to the greatest information resource that has ever existed, well, we actually passed the discovery boundary for that some time ago, and we're still bothered by our provider every fourteen months to get a new one in a way that has, arguably, not much to do with our needs, the MTBF of the existing handset, or in fact anything other than the need for the development cycle to be maintained.

It isn't difficult to relate this to film and TV work. On the consumer side, the need for manufacturers to sell more TVs couldn't be more obvious. Historically, things lasted a lot longer than the maybe-ten-years during which HD has been the thing to have. In the UK, we've only ever had three major television standards since the BBC started regular television broadcasts in 1936 (405-line until 1985, 625-line monochrome from 1964, and 625-line PAL colour from 1967); HD doesn't really count as a fourth yet, given that HD receivers are still far from universal. Under those systems, viewers were only ever required to buy new equipment twice in the better part of eighty years, and that's if you count colour. HD and 4K, by contrast, are likely to chase each other by less than ten years for many people. In the past, backward compatibility was everything and requiring viewers to buy new equipment was anathema. The convenience of the consumer (as opposed to the convenience of the manufacturers, much as I hate to make this into a polemic against big business) was prioritised, even to the point of specifying a fractional frame rate for NTSC. Even at the time, this must have seemed a less-than-ideal engineering decision, and now it's recognised as a major blunder which haunts us to this day. And yes, that decision was made to avoid breaking backward compatibility.
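As an aside on that fractional frame rate, the arithmetic behind the compromise is easy to reproduce. The sketch below is my own back-of-envelope illustration using the published NTSC figures, not anything drawn from this article:

```python
# Rough illustration of the NTSC frame-rate compromise (standard published
# figures; a back-of-envelope sketch, not from the original standards work).
mono_frame_rate = 30.0                       # 1941 monochrome NTSC rate, fps
colour_frame_rate = mono_frame_rate * 1000 / 1001
print(colour_frame_rate)                     # 29.97002997... fps, the fraction we still live with

# The 0.1% slip kept the new colour subcarrier clear of the 4.5 MHz sound
# carrier on existing black-and-white sets, rather than breaking compatibility.
line_rate = colour_frame_rate * 525          # lines per second
colour_subcarrier = line_rate * 455 / 2      # Hz
print(round(colour_subcarrier))              # ~3579545 Hz, i.e. 3.579545 MHz
```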

The 4K Proposition

4K as a broadcast television proposition, on the other hand, was developed almost solely to create demand for new equipment. Whether we should consider it an actually useful development isn't necessarily obvious, since HD already defeats the angular resolution of most people's eyes at normal viewing distances. It's not necessarily the case that no creative use for 4K will ever be found, and as a mastering format, it's good to have more than the final version will show. But as a consumer phenomenon it is unmistakably not a case of necessity being the mother of invention. Rather, invention is the mother of... well, commercial imperative? Either way, if 4K does succeed as a way to make people buy new TVs, there is a critical question over where manufacturers will go to find a hook on which to hang the next generation of hardware. Doubling resolution again is of such scant use that even the most vocal supporters of 4K seem dubious. 3D doesn't seem to have gone that well. Frame rates beyond 60 offer diminishing returns even for ardent sports viewers. On this basis, it seems that the commercially-led model of technological development may have a built-in dead end. Nobody, for instance, tries to make money out of the further technologisation of books, and DAB radio is still struggling for acceptance.
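To put a rough number on that angular-resolution claim, here is a back-of-envelope sketch. It's my own illustration, assuming the textbook figure of roughly one arcminute for normal visual acuity and a 16:9 panel; the function and the numbers are mine, not the article's:

```python
import math

def max_resolving_distance(diagonal_inches, vertical_pixels=1080, acuity_arcmin=1.0):
    """Distance (metres) beyond which an eye of the given acuity can no longer
    separate individual pixel rows on a 16:9 panel of this diagonal size."""
    diagonal_m = diagonal_inches * 0.0254
    height_m = diagonal_m * 9 / math.hypot(16, 9)   # panel height from the diagonal
    pixel_pitch_m = height_m / vertical_pixels
    acuity_rad = math.radians(acuity_arcmin / 60)   # one arcminute in radians
    return pixel_pitch_m / acuity_rad

# A 50-inch 1080p set stops showing individual pixels beyond roughly 2 m,
# which is closer than most living-room seating positions.
print(round(max_resolving_distance(50), 2))         # ~1.98
# The same panel at 2160 lines (4K) halves that distance to roughly 1 m.
print(round(max_resolving_distance(50, 2160), 2))   # ~0.99
```

On those rough numbers, at a typical sofa distance a 1080p picture on a mid-sized set is already at or beyond what the eye can resolve, which is the point being made above.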

Camera Change for Change's Sake

And so to cameras. If people are going to want 4K TVs, there will obviously be a need to shoot 4K images, or at least images we can claim are 4K and have it stand up in court. My concern isn't really with 4K in particular: as I say, having more resolution than you really need can only be a good thing, and it defuses criticism from the world's remaining 35mm die-hards that digital cinematography lacks sufficient resolution (even at HD, it really doesn't). There is, obviously, nothing wrong with more performance. There is something wrong with change for change's sake, or for money's sake, at least when the person inconvenienced isn't the one who'll profit. I care about and like technology inasmuch as it makes camerawork better, but I also care about and like photography as an independent subject, and at the moment there is a tendency for people who should know better (such as directors of photography) and people who should be better advised (such as producers) to chase specifications rather than art.

To an extent it's an experience thing, but anyone who's used more than, let's say, three different cameras will have begun to realise that things don't start looking markedly different, even between comparatively low-end and much higher-end models. But more than that, the constant changes to camera equipment do not automatically gift any production better lighting, composition or production design. Instead, there's a risk that key creative people are faced with a constant, and perhaps rather needless, cycle of learning new technology, which acts as a distraction from improving more general, and more important, technique. As long as it helps, there's no problem with learning new things, but given that we now clearly have adequate replacements for 35mm, it would be nice if things slowed down a little.

Enough Technology

For decades, knowing 35mm and lighting was enough. Without wanting to seem ungrateful for the benefits technology has brought us, I'd like to propose that current technology is probably good enough for the next several decades, too, and we don't all need two new cameras a year.
