
How big will storage get - and how far will it go?

Pic: Shutterstock

David Shapton on the journey storage technology has undertaken in recent decades and where it might be heading next.

It's often tempting to oversimplify any storage analysis to the point of merely asking, "What size of bucket do you need?" But anyone involved in video production knows that it's a lot more complicated than that. Speed, cost, bandwidth, security and safety all matter, and finding the best storage for a job will most likely involve juggling all of those, prioritising some and compromising on others.

The amount of potential storage in the universe is probably very close to infinite, but how do we access it? For example, if every single atom could store data, that's impressive - but what would that mean in practice?

To understand what the ultimate storage might be like, it's worth spending a minute looking at the nature of progress itself and what we can learn from it. Take video resolutions, for example.

Over a very short timescale - a mere twenty years or so - video resolutions have mushroomed from Standard Definition to 8K and even 12K in the case of Blackmagic cameras. It's a great illustration of exponential growth. HD is around five times the amount of data of SD. 4K is four times HD. 8K is four times 4K, and so on. Do the sums, and you will find that 8K requires around 80 times more data than SD - you literally need 80 times the space to store it. The amount of data generated by 12K is the equivalent of a roughly 80-megapixel still picture multiplied by the frame rate. Luckily, clever compression mitigates this, but the trend is upward, whichever way you look at it.
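
For anyone who wants to check those sums, here's a quick sketch in Python, assuming PAL SD and the usual 16:9 frame sizes (the exact ratios shift a little with other SD standards):

```python
# Sanity-check the resolution arithmetic: pixel counts and ratios vs SD.
resolutions = {
    "SD (PAL)": (720, 576),
    "HD":       (1920, 1080),
    "4K UHD":   (3840, 2160),
    "8K UHD":   (7680, 4320),
    "12K":      (12288, 6480),  # Blackmagic URSA Mini Pro 12K sensor
}

sd_pixels = resolutions["SD (PAL)"][0] * resolutions["SD (PAL)"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:10s} {pixels / 1e6:6.1f} MP  ({pixels / sd_pixels:5.1f}x SD)")
```

Running it shows HD at 5x SD, 4K at 20x, 8K at 80x, and 12K at almost 200x - per frame, before you even multiply by frame rate.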

Will it be upward forever? Not as part of the same progression. Something may take over from "conventional" digital video that will call for even greater data storage, but it will be because of some additional dimension beyond 2D video. Look up "metaverse" for possible outcomes.

But, for several reasons, "conventional" video is unlikely to go much beyond today's state of the art. You probably wouldn't notice the difference if it did. At the point where your TV screen is the size of your living room wall, you almost certainly don't need more than 12K. But there's a more significant, general point here. Once video resolutions get high enough, there's nothing left for even higher pixel densities to fix. Specifically: digital artefacts that were objectionable in early incarnations of digital video effectively disappear with a sufficiently high resolution. In truth, they don't go away: it's just that they're too small or too insignificant to notice.

So it might turn out that for most practical purposes, there's no need for a nearly infinite supply of storage. What matters more is that it's fast (enough), secure and reliable. Modern storage has evolved into an almost bewildering variety of "species", so every niche has its own elegant set of options.

But that's only one side of the story.

The constant need for more

The other argument is that everything we do produces data, and as our lived experiences become more digital, that amount will grow and keep growing.

Not all data is digital. We live in an organic, analogue world where everything we do leaves a trail. For example, you can learn a lot about a person by looking at their living space. Is it tidy? Are things obsessively arranged? Or is it more chaotic, or perhaps minimalist? Viewing the same habitat over time will reveal more about trends and habits. It's possible that everything we do - even our own speech - is somehow "recorded" in our surroundings. So how do we decode this information? Right now, we can't, but as processing power and software sophistication continue to grow, we might find that one day we can "play back" our previous lives from the sofa in our living rooms or the wallpaper in the bathroom.

Today, the cloud offers almost infinite storage. An individual will never run out of cloud storage. The cloud - which is not actually a cloud but a worldwide network of air-conditioned data centres - will continue to be built at pace to meet growing data demands. The practical limit for users is not space but the cost of access, and that will, almost inevitably, fall per unit of storage.

As the metaverse matures and we spend more of our lives interacting with each other in the digital domain, storage will need to grow to accommodate our new mode of existence. What are the limits to this? Very few, apart from one, perhaps surprising, gigantic limiting factor: noise.

The opposite of information is randomness. A purely random signal or data set contains no information. If it did, it wouldn't be random. Between data and randomness is a space where it takes processing - real-time mathematics - to separate the signal from the noise. Remember that digital storage relies on analogue phenomena: magnetic fields, optical reflectivity, and voltages. The lowest layers of digital storage are resolutely analogue, and the smaller they get, the harder it is to extract reliable information.
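
To make that concrete, here's a minimal sketch in Python, with illustrative voltage and noise figures of my own choosing (not taken from any real device), showing what happens when the analogue levels that encode bits shrink towards the noise floor:

```python
import random

# Store each bit as one of two analogue voltage levels, add Gaussian
# noise, then recover it with a simple threshold detector at 0 V.
def bit_error_rate(level: float, noise_sigma: float, n: int = 100_000) -> float:
    errors = 0
    for _ in range(n):
        bit = random.getrandbits(1)
        voltage = (level if bit else -level) + random.gauss(0.0, noise_sigma)
        if (voltage > 0.0) != bool(bit):
            errors += 1
    return errors / n

# As the signal level shrinks towards the noise floor, errors soar.
for level in (1.0, 0.5, 0.25, 0.1):
    print(f"signal +/-{level:4.2f} V, noise sigma 0.25 V: "
          f"BER = {bit_error_rate(level, 0.25):.4f}")
```

Real drives and flash controllers fight this with error-correcting codes rather than a bare threshold, but the underlying trade-off - smaller signals, more errors - is the same.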

The potential futures of storage

Digital processing has evolved over the decades and can now extract extraordinary amounts of data from media that would otherwise be too noisy, too indistinct. But in the future, it will have to deal with another phenomenon: quantum effects.

In theory, single atoms could be employed to store data. That's certainly intriguing, not least because it promises the potential to keep all the digital data that's ever been generated in a space the size of a sugar cube. But there's a problem: atoms don't behave like everyday objects. The familiar laws of physics don't apply at that scale. Well before you reach the scale of individual atoms and molecules, so-called quantum effects start to influence behaviour. Existing chipmakers are already seeing quantum effects as they shrink on-chip interconnections down to only a few atoms wide.
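
How plausible is the sugar cube? Here's a rough back-of-envelope sketch, assuming (very generously) a solid cube of carbon storing one bit per atom - purely illustrative numbers, not a real technology:

```python
# Back-of-envelope: how much data fits in a sugar-cube-sized block of
# carbon at one bit per atom? Illustrative assumptions throughout.
AVOGADRO = 6.022e23          # atoms per mole
CARBON_MOLAR_MASS = 12.0     # grams per mole
GRAPHITE_DENSITY = 2.26      # grams per cubic centimetre

volume_cm3 = 1.0             # roughly a sugar cube
atoms = GRAPHITE_DENSITY * volume_cm3 / CARBON_MOLAR_MASS * AVOGADRO
zettabytes = atoms / 8 / 1e21   # one bit per atom -> bytes -> ZB

print(f"{atoms:.2e} atoms -> about {zettabytes:.0f} zettabytes")
```

That works out at roughly 14 zettabytes per cube, so "all the data ever generated" - commonly estimated in the low hundreds of zettabytes - would need a small handful of them. The point stands: atomic-scale storage would be absurdly dense.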

Maybe quantum computers will prove to be adept at solving quantum problems. Until then, molecular-scale storage seems a way off. But don't dismiss it altogether, because the impossible tends to become possible unexpectedly these days.

Meanwhile, what about going into space? What if we could put storage out there? Maybe we could deposit it at one of those places, called Lagrange points, where the gravitational pulls of two large bodies balance out. Perhaps the James Webb Space Telescope needs company!

Only two problems here. First: it's expensive to send hardware into space. That's likely to be the dominant factor in service costs. Second: latency. Space is big enough to make the speed of light seem inadequate. Simply put, the time it takes to retrieve data from an ultra-remote drive will likely make it unusable for most purposes, except for deep archive, where retrieval speed is not an issue.
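
For a sense of scale: the Sun-Earth L2 point, where the James Webb Space Telescope orbits, is about 1.5 million kilometres away. A quick calculation shows the best-case round trip for a read request:

```python
# Round-trip light delay to a drive parked at Sun-Earth L2,
# roughly 1.5 million km from Earth (JWST's neighbourhood).
SPEED_OF_LIGHT_KM_S = 299_792.458
L2_DISTANCE_KM = 1.5e6

round_trip_s = 2 * L2_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
print(f"Best-case round trip: {round_trip_s:.1f} seconds")
# ~10 seconds before the first byte can even arrive - fine for deep
# archive, hopeless for anything interactive.
```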

Over the last 60 years, storage has, luckily, expanded to meet the demands of our ever-increasing capacity to create and consume data. Today, more than ever, it's hard to say whether that will continue. If there is a hard limit on how small we can make components and interconnects, then the only way to continue is to make larger, more parallel devices, or to use distributed, networked storage. Ultimately, there's a limit to that, too. But with multiple new technologies poised to help out over the next decade, it would be surprising if we didn't see a continuation of the current trend towards higher capacity, if not an acceleration.
