
Personal View: Phil Rhodes has some provocative views to start 2013


Phil Rhodes' Predictions for 2013

To mark the occasion of the new year, our editor has asked me to come up with some predictions for the coming twelve months

This is either a well-intentioned attempt to provoke informed debate, or a carefully considered operation by the Make Phil Look Daft club, based on the distinct possibility that come next December I will be reading these words through my fingers.

Here goes:

Prediction one: the continued decline of 3D.

Oh, come now, this isn’t that contentious, surely. Work as we might in an industry that has a breathtaking tendency to ignore technical reality when it’s politically inconvenient, I still think that current stereoscopic 3D techniques are flawed to the point where they should either be replaced or simply fall out of use.

Again.

Right now, we have systems which are really just incremental improvements on the last few times this has been tried, with no advances which, in my view, break any new ground. The problems are many. There is a disparity between focus distance and convergence distance. Absolute convergence is still controlled by screen size, and modulated by the distance of the viewer from the screen. Viewers must sit with their eyes precisely level, or else be forced into a vertical convergence offset which is almost immediately painful. And human stereoscopy isn't really effective beyond 20 or 30 feet in most people – the changes in convergence angle become so subtle we're hardly aware of them – and cinema screens are usually at least that far away, so the entire effect is being forced into a situation which doesn't naturally warrant it.
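To put a rough number on that last point, here is a minimal back-of-the-envelope sketch of how quickly the convergence angle flattens out with viewing distance. It assumes an interocular spacing of about 65mm, which is a commonly quoted average rather than a figure from anything above:

```python
import math

# Back-of-the-envelope only: ~65mm is a commonly quoted average interocular
# distance, not a figure taken from this article.
INTEROCULAR_M = 0.065

def convergence_angle_deg(distance_m: float) -> float:
    """Angle between the two lines of sight when both eyes fixate a point
    straight ahead at the given distance (simple isosceles-triangle model)."""
    return math.degrees(2 * math.atan((INTEROCULAR_M / 2) / distance_m))

for feet in (3, 10, 20, 30, 100):
    metres = feet * 0.3048
    print(f"{feet:>3} ft: ~{convergence_angle_deg(metres):.2f} degrees")
```

By 20 or 30 feet the angle is already well under a degree, and everything from there to infinity has to fit into a fraction of that, which is why convergence tells us so little about depth at cinema viewing distances.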

All of these things are fundamental differences between the way we see real 3D objects and the way stereoscopic filmmaking works. These problems aren’t much discussed, not because they aren’t important, but because current techniques have, and can have, no solutions to them.

But that's not why I think it'll gently fade away again. That'll happen because, to be even vaguely watchable, current 3D techniques need to be so subtle that the effect isn't much use as a tool for directors, if it ever was. When it's that subtle, when it adds that little, it's not worth paying for. I don't think audiences will keep paying extra for it, and once they stop, there's not a lot of point in going through the rigmarole of shooting it.

I hope that all of the people currently making a living in 3D can continue to do so, making ride films and short subjects where the technique is easier to tolerate and has more of a purpose. But for serious narrative filmmaking?

Prediction two: cameras will continue to matter less

The performance of the most recent cameras is more an expression of the state of the art in semiconductor manufacturing than of any particular inspiration in camera engineering. Cameras such as the C500, F55 and Alexa have, or soon will have, extremely comparable performance and could, in the main, do more or less the same set of jobs. This remains true all the way down the cost scale, down to cameras which cost as much as one flash card for an Epic, and even price is far less of a guide to performance than it once was. Very cheap modern cameras are now so incredibly good that a comparison with the very best equipment of a decade ago is palpably embarrassing, and even on the big screen the gap between cheap and expensive gear is closing. The difference between a Canon 5D Mark II and an F65 – primarily in sharpness and compression artifacts – is extremely obvious side by side, and the lack of latitude, in its truest meaning, can be crippling on lower-end gear, since it makes mistakes or less-than-ideal setups harder to conceal. But crucially, the 5D is more than good enough that features and TV shows can be, and have been, shot on it without that being a significant distraction to the audience.

And this is new. Until recently – let’s say until the post-HDV revolution in flash recording systems – the cost of tape decks made video cameras expensive. With fast flash now trivially inexpensive and full HD CMOS sensors likewise, it’s now possible for more or less everyone to get good recordings of good pictures. Only in niches like high speed are things still pricey.

And this is great. The sooner the world of filmmaking – particularly the young, bright-eyed world of filmmaking – realises that it isn't about the toys, the better. If we can stop worrying about cameras and just go shoot, we'll all get a lot better a lot faster. In fact, what are you even doing reading this? Go and organise a short film!

Prediction three: traditional broadcasters will matter even less

The catastrophic, advertising-led race to the bottom in which a lot of the world's media seems to be so enthusiastically engaged is making traditional broadcast and network television worse, very quickly. Simultaneously, in just a few years YouTube has gone from being a cute sideline to a major international media outlet, despite its variable quality – and an outlet which liberates a lot of money and potentially makes it available to the film and TV industry. I've been saying for years that the internet is the broadest of all broadcast mediums. Now, full-budget series like the one Microsoft produced for the most recent Halo instalment, the low-budget indie fun and games of people like Freddie Wong, and pictures of kittens falling over are all coming down the same pipe.

I'm not uncritical of this situation. Words like “democratisation” are bandied about as if there were never any need for a gatekeeper of any sort, anywhere, ever, and I think the average quality of YouTube content is a reflection of that problem. What I find more telling is that what we might call the proper film industry – to wit, things like that Halo series – is starting to use it. Indies were always going to use it. That's straightforward, and there's nothing wrong with it. But the whole thing is greatly legitimised when people are willing to produce new material for YouTube in exactly the same way, and with exactly the same sort of investment, that they'd have put into content for Fox or the BBC. That's relatively recent, and I think YouTube premieres of new television will become normal over the next year or two.

Prediction four: Strong Arm tactics

When Google and Samsung released the first Chromebook laptop with an ARM processor earlier this year, there was very little fanfare (except this article in RedShark - Ed). In the future, I suspect this may well be remembered as a turning point – the first time anyone tried to sell a mass-market laptop that didn't have a processor compatible with Intel's x86 instruction set.

The merits of the Chromebook aside, the rise of microprocessor design firm ARM has recently been meteoric, perhaps driven by the explosive growth of the smartphone market. Competition for Intel's crown has long been fierce, with both Cyrix and AMD building directly competing products, and companies like IBM (with PowerPC) and Motorola (with the 68000 series) offering alternatives which have all brushed with greatness as desktop and workstation processors during the last three decades. What's interesting about ARM is that their designs have a ratio of power consumption to performance that Intel's architecture may never be able to match, even though Intel has an absolutely huge amount of semiconductor manufacturing expertise.

What this throws into sharp relief is that the main reason people use Intel's processors is that all of the world's most important software runs on them. That almost invariably means the Windows operating system, which is itself used because most of the world's most important applications run under it. And with Windows 8, Microsoft has announced a version compiled for ARM.

Right now, ARM don't have processors with the performance to replace the high end of Intel's workstation and server CPUs, regardless of the software situation. But they're improving rapidly, and the lower power consumption of their designs suggests that high clock speeds and multiple cores are really just a matter of manufacturing technology. That's still no small ask, but if I were Intel, and I knew that my usefulness was based largely on an instruction set that is difficult to build into the tiny, low-power devices that are currently popular, I might be quite concerned.

