The State of Global Cloud Development

Statistics and projections from Cisco’s Global Cloud Index show that the world’s data centers are already processing 4.7 zettabytes (4.7 million petabytes) per year.

Cisco research says this amount will continue to grow by 23 per cent annually for the next few years.

(Infographic: "How much is a petabyte?" Source: https://visual.ly/how-much-petabyte)

If we project these numbers out over the next 25 years, we arrive at an astonishing 830 zettabytes processed annually in the year 2040. A slightly higher growth rate would put us into The Yottabyte Age by that time.
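For readers who want to check the arithmetic, here is a minimal sketch in Python. The 4.7-zettabyte base figure and 23 per cent growth rate are Cisco’s numbers quoted above; the 25-year horizon is this article’s own assumption.

```python
# Compound-growth projection of annual data-center volume.
base_zb = 4.7   # zettabytes processed per year today (Cisco Global Cloud Index)
growth = 0.23   # Cisco's projected annual growth rate
years = 25      # projection horizon used in this article (to roughly 2040)

projected_zb = base_zb * (1 + growth) ** years
print(f"Projected annual volume: {projected_zb:.0f} ZB")  # ~830 ZB

# A yottabyte is 1,000 zettabytes. The growth rate that reaches 1 YB
# over the same 25 years is only slightly higher:
required = (1000 / base_zb) ** (1 / years) - 1
print(f"Growth rate needed for The Yottabyte Age: {required:.1%}")  # ~23.9%
```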

It’s thus incumbent on technology providers to develop much more efficient ways of building and operating data centers.

Data centers already account for about 2 per cent of North American electricity usage, and are straining electricity grids throughout the world as well. Growth on the scale outlined above would demand several times more electricity than today’s entire global electrical grid provides.

This is no secret, and efficient design is a top-of-mind issue for data center designers and operators, who will be convening this week at the DCD Internet Summit in San Francisco.

The Need for Openness

I’m co-chairing a new track at this event, called StackingIT. I’ll be co-announcing a new program developed by the event’s organisers and the Tau Institute, which I founded in 2011.

This program will outline the need for open technology in data centers, along with a community-centered method for measuring openness. The technologies in scope range from operating systems and frameworks to chips and large-scale data center designs themselves.

In our opinion, open development fosters innovation, increases security and support options, prevents vendor lock-in, benefits buyers, and enables market hypergrowth.

Hypergrowth It Is

Hypergrowth is the issue here, and cloud computing’s highly distributed architectures are the implicit catalysts of this projected growth. With mobile and cloud computing in all their forms gaining enormous traction in enterprise and consumer IT, coupled with a developing Internet of Things and the (really) Big Data and analytics it spawns, the onslaught of data has only just begun.

What a pity it would be if our data centers and electrical grids are not up to the task.

A Big Switch

Our research over the past few years has shown that smart, equitable technology development is the key to equitable socio-economic development throughout the world. Despite recent gains in reducing global poverty and its ill effects, there is still a very long hill to climb.

Today, for example, the developed world represents 18 per cent of the world’s population yet consumes 47 per cent of its electricity. The developing world (not including China) represents 63 per cent of the population and 28 per cent of the electricity use. China has 19 per cent of the population and 25 per cent of electricity use.
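Those shares imply a stark per-capita gap. Here is a quick sketch of the arithmetic in Python, using only the population and electricity shares quoted above; the per-capita intensity framing is mine, not a figure from the underlying research.

```python
# Per-capita electricity intensity implied by the shares above:
# intensity = share of electricity / share of population.
# A value of 1.0 means usage exactly proportional to population.
regions = {
    "Developed world":             (0.18, 0.47),
    "Developing world (ex-China)": (0.63, 0.28),
    "China":                       (0.19, 0.25),
}

for name, (pop_share, elec_share) in regions.items():
    print(f"{name}: {elec_share / pop_share:.2f}x proportional usage")

# Developed world: ~2.61x; developing world (ex-China): ~0.44x; China: ~1.32x.
# Per capita, the developed world thus uses roughly 2.61 / 0.44, or about
# six times, as much electricity as the developing world outside China.
```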

Optimistic projections in which the developing world continues to grow more quickly than the developed world show global electricity demand more than doubling by 2040, with the developing world consuming almost half of the world’s electricity by then.

Even this optimistic projection will require many trillions of dollars of investment in the developing world, and will still bring 68 per cent of the world’s population up to a usage level of only 10-15 per cent of the developed world’s. And even that progress will not happen unless our technology becomes much more efficient.

We run many scenarios, with divergent results. All of them point to a critical need for enormous improvements in the efficiency of the chips we build, the software that runs on them, and the data centers that form the core of our connected world.
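To make that point concrete, here is an illustrative scenario run in Python. The 23 per cent data-growth figure is Cisco’s; the other growth and efficiency rates are hypothetical inputs of the kind our scenarios use, not published results.

```python
# Illustrative scenarios: how much data-center energy demand multiplies
# over 25 years for a given data growth rate and annual efficiency gain.
# Energy multiple = ((1 + data growth) / (1 + efficiency gain)) ** years.
years = 25
scenarios = [
    # (annual data growth, annual efficiency improvement)
    (0.23, 0.00),  # Cisco's growth rate, no efficiency gains
    (0.23, 0.15),  # steady but insufficient efficiency gains
    (0.23, 0.23),  # efficiency keeps pace: energy demand stays flat
    (0.26, 0.15),  # faster data growth against the same efficiency curve
]

for data_g, eff_g in scenarios:
    multiple = ((1 + data_g) / (1 + eff_g)) ** years
    print(f"data growth {data_g:.0%}, efficiency gain {eff_g:.0%} "
          f"-> energy demand x{multiple:.1f}")
```

The divergence is the message: with no efficiency gains, energy demand grows roughly 177-fold; with gains that keep pace with data growth, it stays flat.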

The only viable method to meet that need, in my opinion, is the open method. Our new program will measure openness and market leadership, with the hope that we can help the technology community continue to move the needle in the right direction.