Next week sees the Media Production Show make its return, and Editor in Chief David Shapton will be chairing a discussion about disruptive technology. But precisely what form this takes will be up to you!
I’m going to be chairing a discussion at the Media Production Show in London on 12th June at 1pm, called “Five Digital Disruptors”. We live in an era of disruptive technology and I don’t think I’ll have any difficulty thinking of five examples. But I don’t want to do this in isolation. I’d rather listen to what people have to say than just go with my own subjective opinion.
But to get things going, here are my ideas so far.
- Exponential technology
- The Netflix effect
- FPGAs
- AI and ML
- 5G
Let’s look at these in some more detail.
Exponential technology
It’s now pretty widely accepted that, in at least some sense of the words, technology is growing in power and capability exponentially. What that means is that when there’s a new scientific or technological development, its effect is not additive but multiplicative.
Which means that the line that depicts our progress keeps getting steeper.
The effect of this is that for each increment on the time scale - one decade, one year, one month - you get more progress than in the previous period. You can see this happening very clearly with camera resolutions: SD, HD, 4K, 8K - each one with at least four times the number of pixels of the previous one (SD to HD was slightly more than that). It means that, across four generations of resolution, 8K has roughly 80 times more pixels than SD.
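As a rough sketch of that arithmetic - using nominal PAL SD (720×576) and the standard HD, UHD 4K and 8K frame sizes - the generation-on-generation multiples work out like this:

```python
# Rough pixel-count comparison across resolution generations.
# Figures are nominal: PAL SD (720x576), HD (1920x1080), UHD 4K and 8K.
resolutions = {
    "SD": (720, 576),
    "HD": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

prev = None
for name, count in pixels.items():
    if prev is None:
        print(f"{name}: {count:,} pixels")
    else:
        print(f"{name}: {count:,} pixels ({count / pixels[prev]:.1f}x previous)")
    prev = name

print(f"8K vs SD: {pixels['8K'] / pixels['SD']:.0f}x")
```

SD to HD is a 5x jump, then 4x for each step after that - which is where the roughly 80x figure comes from.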
When I was born, the state of the art in consumer electronics was a valve (tube) radiogram. Now it’s an iPhone with seven thousand million transistors (where a transistor is approximately equivalent to a valve).
And that’s just today, and it’s been on sale for six months. Imagine what’s coming up next.
It’s this effect that’s probably the biggest disruptor of all. Or is it?
The Netflix Effect
To the broadcast industry, the Netflix effect is the Big Bang. There is nothing more disruptive. Video on demand is an old idea, but it’s always been hampered by bandwidth. For many people, this is no longer the case. I live in the UK's seventh-biggest city and I have 300 Mbit/s broadband. That's enough for several channels of 8K. Soon it will be 1Gbit/s.
Netflix (and Amazon, Hulu and the rest) are changing the way people consume media. We no longer exist on a scheduling timeline that's been imposed on us. We can binge, fast and then binge again - or set our own pattern entirely. It's cheap, and the quality (technical and content-wise) is as high as you can get. So much so that, arguably, Netflix is now the prime arbiter of broadcast and film standards.
FPGAs
Field Programmable Gate Arrays: just a string of letters to most people, but actually the most significant hardware available to designers and developers right now. Just briefly: FPGAs are very fast chips whose internal layout can be determined at boot-up by software.
If you want an algorithm to run faster than it would on a general-purpose processor (a CPU, for example), write it so that it will run on an FPGA and it could run up to a hundred times faster. It's like being able to design your very own processor - and, what's more, if you boot it up with a different program, it becomes another type of processor.
Intel bought Altera, one of the two main FPGA manufacturers, for $16.7 billion in 2015. That says a lot about the perceived future importance of FPGAs. What's more, just last week Microsoft stated that it would be using general-purpose FPGAs to power its cloud-based suite of AI tools, rather than fixed-architecture AI chips. I think the Seattle software company's reasoning is that "fixed" chips go out of date, whereas FPGAs can be reprogrammed and reconfigured. Here's RedShark's explainer for FPGAs.
AI and ML
We could say so much or so little about this. Every time we mention it we could fill a book. Suffice to say that it's incredibly important and, at the same time, very hard to measure - and certainly to predict. My feeling is that AI and ML will turbocharge the rate of progress, but will do so in a stochastic (i.e. quasi-random) way. Which is why I've coined the phrase "stochastic hyper-exponentiality" for the times we're living in.
5G
5G cellular communication is different from all the previous generational leaps in mobile technology. It's different because it's up to two orders of magnitude faster, and because it will be all-embracing and all-pervasive (once it's rolled out, of course). With speeds up to and over 1Gbit/s, it will be feasible to upload video to the cloud in real time. Ultimately, in-camera storage may come to be seen as just a "buffer".
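To put the real-time claim in perspective, here's a back-of-the-envelope check. The camera bitrate and uplink speed are illustrative assumptions, not measured figures: a 400 Mbit/s recording rate (typical of a high-quality acquisition codec) against the headline 1 Gbit/s 5G link.

```python
# Back-of-the-envelope: can a 5G uplink keep up with a camera's data rate?
# Both figures below are assumptions chosen for illustration.
camera_bitrate_mbps = 400   # assumed recording bitrate (high-quality codec)
uplink_mbps = 1000          # assumed 5G uplink: 1 Gbit/s

# Real-time upload is feasible when the link outruns the recording rate.
realtime_feasible = uplink_mbps > camera_bitrate_mbps

# Time to upload one minute of footage recorded at that bitrate.
footage_megabits = camera_bitrate_mbps * 60
upload_seconds = footage_megabits / uplink_mbps

print(f"Real-time upload feasible: {realtime_feasible}")
print(f"One minute of footage uploads in {upload_seconds:.0f} seconds")
```

At those assumed rates the link uploads a minute of footage in well under a minute, which is what makes the "storage as buffer" idea plausible.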
But it's much more than that. It's likely that 5G will become the default way for us to connect and transfer data. It will typically be faster than our own broadband connections. Before long, we will be wrapped in a wireless fabric of extremely high speed data communications and we will, shortly after that, wonder how we ever managed without it.
Joining me at the discussion will be Nic Hatch, CEO of Ncam, and Brett Danton, director and RedShark contributor. We're hoping for one more guest, who has yet to confirm. If you can get to London, the talk is at the Media Production Show on 12th June at 1pm.
So that's it for now. What do you think? Let me know either in the comments or by writing to email@example.com