What does the future hold for data storage?

Data storage, data storage, data storage. Why does it seem like all of a sudden everyone is talking about data storage? When did it become so important or perhaps a better question is… why is it suddenly so important?

The quick answer is of course… the advent of the internet. The sheer amount of data generated in recent years is simply enormous!

It’s estimated that 3.7 billion people use the internet daily, generating over 2.5 quintillion bytes of data between them every day, and by 2025 we’ll collectively be storing over 160 zettabytes of data across the planet.

If you’re not sure how big a zettabyte is, it’s 1,000,000,000,000,000,000,000 bytes of data.
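To make that number a little more tangible, here’s a quick back-of-the-envelope sketch (using the decimal units drive vendors use) of how many ordinary 1 TB drives one zettabyte would fill:

```python
# Quick sense of scale: a zettabyte expressed in more familiar units.
ZETTABYTE = 10**21  # bytes
TERABYTE = 10**12   # bytes (decimal convention, as used by drive vendors)

drives_per_zb = ZETTABYTE // TERABYTE
print(drives_per_zb)  # 1,000,000,000 -> a billion 1 TB drives per zettabyte
```

So the 160 zettabytes forecast above would correspond to roughly 160 billion 1 TB drives’ worth of data.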

Now, we’re not suggesting your organization needs a plan to handle that much data (hopefully… although we’d be happy to help if it does) but it does highlight just how much data is being constantly generated.

Some of that data is likely useful to your organization, some of it isn’t. Some of it you’ll want to keep, some of it you won’t.

If you decide to keep it though you’re going to need to store it somewhere.

But that’s fine right?

Well, maybe not. It might surprise you to know that a data storage crisis is coming and if we keep generating data at current rates, we’ll not have anywhere to store it.

Storing that much data, in the same format we currently do, will just be too expensive or resource intensive to accomplish in any meaningful fashion.

Around 20% of all the energy consumed by technology on our planet currently goes to powering data centers around the globe.

A single data center can use more energy in a day than a first world town… and that’s clearly not sustainable just to store 112 photos of a random night out someone had six years ago (we’re looking at you Mr Zuckerberg!).

What’s the alternative though?

Do we even have an alternative? Well actually yes… some very smart people are working on solutions to this problem even as you read this.

Now obviously trying to guess the future is never going to be an exact science but that doesn’t mean we can’t look at certain sweeping trends affecting data centers and the data sector as a whole to make some educated guesses as to the future of data storage.

What’s The Current State Of Data Storage Technology?

Before we take a deep dive into upcoming trends of data storage, it might help to take a look into the current state of play to see how data is currently being stored.

There are four distinct, but equally important, factors currently driving the evolution of data storage.

  • Cost
  • Capacity
  • Interface speeds
  • Density

Any current (or future) solution seeks to achieve all four with varying degrees of success.

HDDs (Hard Disk Drives)

Let’s be honest, HDDs aren’t going anywhere anytime soon.

According to the most recent projections, around 54% of all data will be stored on HDDs by 2024.

That’s a big drop from its 65% market share in 2019 but, when considered as part of a rebalancing from the ever growing amount of created data, actually makes a lot more sense.

That said, many experts now feel flash may well end HDDs’ dominance, at least in smaller-scale data centers, which are switching instead to a hybrid flash/cloud model, so we’ll have to see what the future holds for the humble hard disk drive.

SSDs (Solid State Drives)

The SSD storage sector is likely to keep growing steadily as the tech behind their interfaces continues to improve.

Currently, quad-level cells (QLCs) are seeing widespread adoption as another tier in the data storage hierarchy, popular because they offer more capacity at a much lower cost, but QLC tech will likely start giving way to penta-level cells (PLCs) over the next few years.
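The naming follows directly from how NAND flash works: a cell that can hold L distinguishable voltage levels stores log2(L) bits. A minimal sketch of that arithmetic, showing why the QLC-to-PLC step adds roughly 25% more capacity per cell:

```python
import math

# Bits per cell for multi-level NAND: a cell with L distinguishable
# voltage levels stores log2(L) bits.
levels = {"SLC": 2, "MLC": 4, "TLC": 8, "QLC": 16, "PLC": 32}
bits = {name: int(math.log2(lv)) for name, lv in levels.items()}
print(bits)  # {'SLC': 1, 'MLC': 2, 'TLC': 3, 'QLC': 4, 'PLC': 5}

# Moving from QLC (4 bits) to PLC (5 bits) adds one bit per cell.
gain = (bits["PLC"] - bits["QLC"]) / bits["QLC"]
print(f"{gain:.0%}")  # 25%
```

The trade-off, of course, is that packing more voltage levels into one cell makes each level harder to distinguish, which is why denser cells tend to be slower and less durable.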

Multi‑Cloud Storage

Many data centers are already seeing that their clients are becoming more and more wary over holding all of their data in one place.

Just as you shouldn’t place all your eggs in one basket, many clients, worried about data redundancy, backups and disaster recovery, are looking for multi-cloud solutions.

It’s good sense when making a back‑up for your data to store it on separate systems, isolated as much as possible, something which multi‑cloud solutions are currently doing extremely well.

The Future Of Data Storage

So, what does the future of data storage look like then?

Unless we collectively start disposing of redundant data (something that’s incredibly unlikely to happen), something is going to have to give.

The alternative needed then is to discover more efficient types of storage.

Storage that can handle exponentially more data, at much lower costs, whilst still giving users almost instantaneous access to it.

Fortunately, there are already people investigating these methods and, whilst they’re not quite ready yet, they may well be one day soon!

Cold Storage

Cold storage, the idea of storing data at extremely low (sub-zero) temperatures, is currently being investigated as an interesting storage possibility.

Researchers at the University of Manchester have been developing molecules capable of storing data at several hundred times the density of a traditional HDD.

As you can probably guess from the heading of this paragraph, these molecules need to be stored at very, very low temperatures.

Now, storing data at low temperatures isn’t a new idea; the science behind it has been understood for a while. What’s new about this research, however, is the temperatures at which these hardy new molecules can reliably hold data.

They’re capable of storing data at 80 kelvins with the use of liquid nitrogen, meaning more data can be stored without running into heat-dissipation issues and, as Earth’s atmosphere is around 78% nitrogen, it’s also a relatively cheap solution.

5D Optical Storage

Rather than cold storage, researchers at the University of Southampton have been exploring methods of storing data by etching it into cubes of silica glass with advanced lasers.

This new technique is being referred to as 5D because, in addition to the normal three spatial axes being used as storage parameters, the researchers are also able to use the size of a recorded structure and its orientation.

Picture it as a kind of 3D version of a CD, with devices reading the data directly from the cube.

The pros of this method are that anything stored in this manner lasts a very long time, with the etched data being almost permanent.

Once the data has been stored, it requires no power to keep it (only read it) and the glass cubes being created are virtually indestructible, with one small cube capable of holding hundreds of terabytes of data.

Quantum Storage

We’d be remiss in discussing the future of data storage if we didn’t at least touch upon quantum computing and quantum storage.

In the world of computing, data and the world at large, there’s been a lot of talk about the potential of quantum computing.

Qubits (as the units of quantum data storage are called) are capable of holding an exponential number of states using superposition and entanglement.

n qubits can represent 2^n states or, to put that in a way that’s a little easier to grasp for anyone not familiar with quantum computing jargon, 100 qubits would be more than enough to hold more states than all the HDDs currently in existence could; 300 qubits could hold more states than there are atoms in the observable universe.
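That exponential growth is easy to check directly. A minimal sketch, comparing 2^100 states against a deliberately generous estimate (our own assumption, not a measured figure) of 100 zettabytes of classical storage expressed in bits:

```python
# n qubits span a superposition over 2**n basis states.
def n_states(n_qubits: int) -> int:
    return 2 ** n_qubits

# A generous stand-in for "all the HDDs in existence":
# 100 zettabytes = 10**23 bytes = 8 * 10**23 bits.
all_hdd_bits = 8 * 10**23

print(n_states(100))                  # about 1.27 * 10**30 basis states
print(n_states(100) > all_hdd_bits)   # True: 100 qubits already dwarf it
```

Even against an overestimate of global HDD capacity, 100 qubits come out more than a million times ahead, and each extra qubit doubles the count again.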

Unfortunately, we’re still a long way from solving the looming data crisis with quantum computing.

Quantum storage’s main problem is that the amount of classical data retrievable from n qubits can never be larger than the amount retrievable from n classical bits, meaning that if data were stored at a quantum level, most of it could never be read back, a problem that’s not yet been solved.

It’s possible that this barrier could well mean that, whilst quantum computing may be the way forward, quantum storage may not be.

DNA Storage

Many scientists, over the last decade or so, have become rather enamored with the idea of using DNA as a method of data storage.

It’s not as far‑fetched an idea as you may think though.

DNA is built from pairs of nucleotide bases which, if suitably encoded, can represent binary code, making it ideal for storing data, whilst the coiled structure of DNA packs a tremendous amount of storage into a very small space.
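Since there are four bases (A, C, G, T), each base can carry exactly two bits. Here’s a toy round-trip encoding to illustrate the principle; the mapping is our own illustrative choice, not any lab’s actual scheme:

```python
# Each DNA base encodes 2 bits, so one byte maps to 4 bases.
# This mapping is an illustrative toy, not a real lab encoding.
ENCODE = {"00": "A", "01": "C", "10": "G", "11": "T"}
DECODE = {v: k for k, v in ENCODE.items()}

def bytes_to_dna(data: bytes) -> str:
    """Turn raw bytes into a string of DNA bases."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(strand: str) -> bytes:
    """Recover the original bytes from a strand of bases."""
    bits = "".join(DECODE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = bytes_to_dna(b"hi")
print(strand)               # CGGACGGC -> 8 bases for 2 bytes
print(dna_to_bytes(strand)) # b'hi'
```

Real DNA storage schemes are considerably more involved (adding error correction and avoiding sequences that are hard to synthesize), but the core idea is the same two-bits-per-base mapping.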

In fact, a team at Harvard was able to accurately store the entire code for a video on a single strand of bacterial DNA.

Since then, several organizations have been working on perfecting and then commercializing this technology.

Current thinking is that synthetic DNA (rather than living tissue) could be the answer, with data being written to a molecule as it’s assembled.

As with etched glass in the 5D optical storage we mentioned, this could well prove a long term storage solution but we’re still not quite there yet.

Hybrid Cloud Storage

Whilst hybrid cloud storage may not seem as futuristic, sci‑fi or interesting as some of the other solutions mentioned so far, it is currently sweeping the data storage sector and offers a real solution right now to many data storage issues.

Hybrid storage is, as the name suggests, an approach to data storage that utilizes both cloud and on‑prem solutions, combining the best of both into a cohesive storage architecture.

Taking this approach reduces costs for organizations by shifting some of the costs of adding and maintaining local storage into the cloud, using the cloud only as and when it’s needed.

It’s also one of the most scalable solutions currently on offer, allowing for the best of both cloud and on‑prem infrastructures.

On‑prem can be used for long‑term, rarely used data, whilst the cloud can be architected out to allow for more commonly used data or unexpected spikes in demand.
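A toy policy following the split described above might look like this; the access-count threshold is a made-up illustration, not a recommendation:

```python
# A toy tiering rule: rarely-used, long-term data stays on-prem, while
# frequently-accessed data lives in the elastically scalable cloud.
HOT_ACCESS_THRESHOLD = 5  # accesses in the last 30 days (hypothetical)

def choose_tier(accesses_last_30d: int) -> str:
    """Pick a storage tier for an object based on how often it's read."""
    if accesses_last_30d < HOT_ACCESS_THRESHOLD:
        return "on-prem"  # cold, archival data
    return "cloud"        # hot data, or spikes needing elastic capacity

print(choose_tier(1))    # on-prem
print(choose_tier(250))  # cloud
```

Real hybrid architectures layer on much more (replication, lifecycle rules, egress-cost awareness), but a simple access-frequency rule like this captures the basic idea.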

Now here comes the shameless plug (you knew it was coming!).

One of the issues often raised with hybrid storage is that, if poorly architected, it can be both expensive and slow, costing money in both on-prem hardware and unexpected cloud bills.

However, with DoubleCloud, you can build your data analytics at SSD storage speeds whilst still maintaining S3 prices.

In fact, by using our managed ClickHouse® platform you can spend over 5x less, with your latest or most frequently used data automatically kept on SSD and less frequently used data automatically shunted to S3 for you.

Get started with DoubleCloud
