|By Roger Strukhoff||
|January 20, 2014 09:58 AM EST||
"None of us really understands what's going on with all these numbers." Thus said David Stockman, the then-wunderkind budget director for newly elected President Ronald Reagan in 1981.
Stockman was widely ridiculed for such a rare burst of candor from a government official. He was referring to the administration's efforts to grapple with the major budget and tax reforms candidate Reagan had promised the year before.
I think it's fair enough to use these words as a basis for what's going on in the commingling worlds of Cloud Computing, Big Data, and the Internet of Things (IoT).
I've promised to write about all that's happening with IoT between now and @ThingsExpo June 10-12 in New York, an event for which I serve as Conference Chair.
Zettabytes Take the Stage
But first I have to get a grip on what's going on with all these numbers.
Let's start with a prediction by CSC, the Washington, DC-area IT services provider. I'm reading one of its infographics that alleges Big Data will cause global data storage needs to increase 44 times by 2020, reaching 35 zettabytes. (It says we had 0.79 zettabyte under control in 2009.)
"Only" 10.5 zettabytes of the 2020 total will be generated by enterprises, according to CSC. But thanks to the cloud, 28 zettabytes will be managed by enterprises.
Break it Down
Let's break this down by imagining a zettabyte. I, for one, am still not comfortable visualizing, abstracting, or using that term. A zettabyte is 1 billion terabytes, 1 million petabytes, or 1,000 exabytes.
Yes, so take today's typical 1 terabyte personal-computing hard drive (worth about $80) and multiply that by a billion to get a single zettabyte. Now imagine storing 28 of them.
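The drive-multiplication arithmetic above can be sanity-checked in a few lines. This is a minimal sketch assuming decimal prefixes (1 ZB = 10^21 bytes) and the $80-per-terabyte drive price from the paragraph above:

```python
# Back-of-the-envelope scale check, assuming decimal prefixes.
TB = 1e12  # bytes in a terabyte
ZB = 1e21  # bytes in a zettabyte

drives_per_zettabyte = ZB / TB                   # 1-TB drives per zettabyte
cost_per_zettabyte = drives_per_zettabyte * 80   # at $80 per 1-TB drive

print(f"{drives_per_zettabyte:.0e} drives")                       # 1e+09
print(f"${cost_per_zettabyte / 1e9:.0f} billion per zettabyte")   # $80 billion
```

A billion drives per zettabyte, at roughly $80 billion in raw disk alone, before you buy a single rack, server, or kilowatt.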
The bandwidth requirements for this amount of data will be similarly daunting. If only 1% of that data were zipping around per second, we'd need more than 2 trillion gigabit connections to make it happen.
(1% of 28 zettabytes = 280 exabytes = 280 million terabytes = 280 billion gigabytes = about 2.24 trillion gigabits.)
We're going to need a bigger boat.
Many Big Datacenters
When we apply the 28-zettabyte figure to datacenters, the initial calculations are equally shocking. This is a relevant calculation in the wake of the recent news that IBM plans to build 15 new datacenters at a cost of $1.2 billion.
That's $80 million per datacenter, a modest number in the datacenter world, and one which will result in an average facility encompassing about 8,000 computers, 80,000 square feet, and perhaps 0.8 exabyte of storage.
To reach 28 zettabytes, we would need only 35,000 of these datacenters in the world. Using IBM's budget for its new datacenter initiative, total cost would come in at 35,000 x $80 million, or $2.8 trillion. If, say, one quarter of them were built in the US, we'd see one every 15 miles or so driving down any road.
Oh, now we have to add about 84,000 megawatts to the electrical grid, which shouldn't require more than around 50 large power plants, whether nuclear or natural-gas. There's also the matter of water usage for cooling, to be measured in the billions of gallons per day.
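The datacenter scaling in the last few paragraphs can be reduced to three divisions. This sketch takes IBM's reported $1.2 billion / 15-facility figures at face value; the 0.8-exabyte storage and implied 2.4-megawatt draw per facility are the article's rough per-facility assumptions, not IBM's numbers:

```python
# Scaling the reported IBM plan ($1.2B for 15 datacenters) up to 28 ZB.
per_dc_cost = 1.2e9 / 15        # $80M per facility
per_dc_storage_zb = 0.8e-3      # assumed 0.8 exabyte = 0.0008 ZB per facility
per_dc_power_mw = 2.4           # assumed megawatts per facility

datacenters = 28 / per_dc_storage_zb     # facilities needed for 28 ZB
total_cost = datacenters * per_dc_cost   # total construction cost, dollars
power_mw = datacenters * per_dc_power_mw # added grid load, megawatts
```

Running this reproduces the figures above: roughly 35,000 facilities, $2.8 trillion, and 84,000 megawatts.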
Can It Happen?
Moore's Law can be expected to work its magic between now and 2020, and the good news is that storage costs have been moving on a curve steeper than Moore's Law. So in the end, these numbers may not be so eye-poppingly large.
But it's clear the global engineering challenge (and opportunity) related to cloud computing, Big Data, and the IoT is enormous. Let's forget for a few seconds what revenue might be generated for software and services companies. Let's forget what value might be added to national economies by new business and new productivity levels.
The US Interstate Highway system was built for $400 billion in current dollars, give or take. The global Information Superhighway (yes, let's bring back that term!) is several times larger, Moore's Law notwithstanding.
But can it happen? Do we have the societal will to build this 21st century hive intelligence?
This is where our friends the politicians must eradicate their collective Anaproctocephalogical Syndrome and do some good for humanity.
The US in particular could be - could be - a leader in open, global communications by ending its "possess the haystack to find the needle" approach to spying on everybody and their brother and your Aunt Maude. Recent remarks by President Obama give me little hope at present.
Because the CC/BD/IoT challenge is as much a socio-political challenge as it is an engineering and economic challenge.
Optimism, Pessimism, or Reality?
The numbers I played with here serve as a general indicator of what it is we have, unwittingly or not, set upon with our wondrous machines. The real numbers will play out over time. In any case, we are on the cusp of transformational change.
IBM's SVP of Global Technology Services Erich Clementi, writing in his blog about the company's new datacenter initiative, touts IBM's commitment to "robust global networks of datacenters."
Clementi also enthuses, "cloud computing is a fabric that will knit the entire world closer together: businesses, economies and people. A lot of good will come of it. But, first, we have to build a robust global network of cloud data centers to turn that promise into reality."
Yes, if all this data can continue to flow across borders relatively easily and peacefully (as email and website information have for some time now), there is hope for all nations of the world to improve themselves through the transformational change wrought by mobility, sensors, and the ongoing social-media revolution.
If not, if instead national firewalls become common to keep the US government out, and we end up living on a globe of re-isolated nations, then all these numbers mean less than zero. No zettabytes for you.