Public cloud scalability has been increasing since its inception. But can this go on forever? Or is there a limit to be reached? Discover more in our article.
One of the many remarkable things about cloud computing is that as the infrastructure gets bigger and the code more complex, the solutions only get faster. Somehow, advances in computing have meant that your cloud provider combines the speed of a cheetah with the size of a blue whale.
It's an exciting time for the digital world, whether as a consumer, front-row observer, salesperson or developer. We're as guilty as the next firm of sounding breathlessly excited about its potential to transform business – and we make no apologies for it.
But is there a limit to how far we can scale up public cloud provision? As corporate data grows exponentially, year-on-year, are we going to reach a point where there's just no room left to store it?
Physical constraints
What we know as the cloud today was brought to life in the early 2000s by the Amazon Web Services (AWS) public cloud. At this point, AWS stood alone in an empty field, brandishing a powerful new tool.
Between 2005 and 2011, infrastructure began to be centralised in data centres. These became the powerhouses of scalability – facilities that hosted increasingly large amounts of compute and storage.
The rest of the decade saw the development of cloud-based software stacks, cloud storage as a service (think Dropbox), multi-region availability and service level agreements (SLAs).
What made this decade of progress possible? Lots of things, of course – but one that doesn't always get a shout-out is physical land.
Yes, the cloud's ethereal nature can lead us to forget that it's possible only because of data centres – huge feats of engineering that the average person rarely interacts with. Corporate hyperscale facilities tend to be built outside of urban areas, hidden from view like the factories that provide consumer goods to the West.
Sure, we can keep on building – that's what we're doing, in fact. 2023 saw new data centres being built worldwide. But one thing is getting in the way: power availability.
You see, data centres are greedy guzzlers. They need to be constantly pumped with electricity. Eco-friendly versions exist or are in development, but these aren't yet scalable. The reality is that most data centres need lots and lots of power – and there's a worldwide shortage.
It's always risky to predict the future. But for now, it seems unlikely that data centres will continue to be built at the same breakneck speed as in the last couple of decades. This could, in theory, put the brakes on public cloud scalability.
Can Moore's Law guide us?
It's not just security and compliance laws that enterprises follow. There's also Moore's Law – the formulation of Intel co-founder Gordon Moore.
Almost sixty years ago, Moore predicted that the number of transistors on a chip – and, with it, computing power – would keep doubling roughly every two years.
And his calculation works, pretty much. In 2009, the fastest processor on the market outperformed its turn-of-the-century equivalent by a factor of about 30.
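That factor of 30 squares neatly with the doubling maths. Here's a minimal sketch of the arithmetic (our own illustration, not a calculation from any cited source), showing that roughly nine years of doubling every two years lands in the same ballpark – and what another 60 years at that pace would mean:

```python
# Back-of-the-envelope Moore's Law arithmetic (our own illustration,
# not a benchmark from any cited source).

def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Roughly nine years of doubling, 2000 to 2009:
print(round(moores_law_factor(9)))       # 23 – the same ballpark as the observed ~30x

# Another 60 years at the same pace:
print(f"{moores_law_factor(60):,.0f}")   # 1,073,741,824 – a billion-fold increase
```

Enormous – but still a finite number.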
But if Moore's Law kept delivering indefinitely, computing power would grow without bound. Is that possible? Two physicists at Boston University think not.
"No system," says Lev Levitin, "can overcome that limit [of infinite computing]. It doesn't depend on the physical nature of the system or how it's implemented, what algorithm you use for computation… This bound poses an absolute law of nature, just like the speed of light."
So even if, as we suspect, computing power continues to increase in the coming decades, that doesn't mean it will do so forever. One consequence is that public cloud scalability – which has so far followed a clean upward trajectory – could eventually falter.
What's next?
The cloud isn't going anywhere – and the amount of data it stores continues to increase at a mind-boggling rate.
In 2024, says Deloitte, "the world is forecast to generate 149 zettabytes of data."
"A zettabyte," they go on, "is a thousand, thousand, thousand, thousand gigabytes. If each byte were a grain of sand, there would be enough to fill every beach on earth almost 20,000 times."
We're not yet at the upper limit of scalability. It seems highly likely that the coming years will see continued rapid growth in storage capacity and data centre infrastructure.
The coming years are also likely to see the growth of emerging technologies that are, as yet, not nearly as scalable as SaaS and IaaS solutions. These include blockchain, the Internet of Things, artificial intelligence and green computing.
Data centres are now essential for the smooth running of many businesses. But no one would deny that they're environmentally unfriendly – not least because of the amount of power they consume.
Advances towards a greener way of working have already been made. Take Microsoft's Project Natick – a subsea data centre that doesn't need energy-hungry cooling equipment, cooled as it is by the sea itself.
It's unclear how scalable these green solutions will be in the short term. If, for example, climate legislation becomes stricter across the world, we could see a limit to data centre construction – and a consequent slowdown in scalability.
Whatever happens, we're likely to see even more powerful computers and a spread of technologies that will make today's offerings look like baby steps. It's going to be interesting.
Final thoughts
It seems highly likely that the upward trajectory in public cloud scalability will continue as computers get even faster and data centres continue to be built.
However, there are technological and social factors that could slow this progress down. And even if they don't, there's still, so far as we know, an upper limit to computing power. Infinity isn't within our grasp and probably never will be.
Are you looking for help with a cloud migration? At Ascend Cloud Solutions, we've managed over 400 migrations and counting. Let's talk about your requirements. Get in touch today for a no-obligation consultation.