
Big Data: Trillions of Gigabytes & Dollars

The Cloud/BigData/IoT Era is Here

Remember when John Sculley predicted a trillion-dollar convergence of consumer and enterprise technology by the year 2000? It turns out he was thinking small.

As I sit here a few months before our upcoming @CloudExpo & @ThingsExpo, to be held Nov. 4-6 at the Santa Clara Convention Center, almost-daily reports remind me that we're entering a multi-trillion-dollar technology era.

We'll have our own big convergence in November, by the way, with Big Data, SDDC, DevOps, and WebRTC all having discrete, focused, co-located events under the @CloudExpo umbrella. We're also planning a Multi-Cloud Bootcamp, Hackathon, and Shootout.

Trillions
The global economy currently sits at about $74 trillion annually, with enterprise IT and telco consuming about $3.7 trillion of that. The Cloud/BigData/IoT convergence is set to ratchet that last number up dramatically.

One report I find quite credible, for example, comes from CSC in Washington, DC. It projects that the amount of data created globally will rise from 790 exabytes in 2009 to 35,000 exabytes in 2020, which works out to roughly 40% compound annual growth. That's 35 trillion gigabytes, or 35 zettabytes in the new coin of the realm.
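
For the arithmetically inclined, here is a rough back-of-the-envelope sketch in Python of what those figures imply. The exabyte values are the ones cited above; the growth rate and gigabyte conversion are simply derived from them, assuming decimal units (1 EB = 1 billion GB).

# Rough check of the CSC data-growth figures cited above.
# The 2009 and 2020 exabyte values come from the article; everything else is derived.

data_2009_eb = 790        # exabytes created globally in 2009
data_2020_eb = 35_000     # exabytes projected for 2020 (35 zettabytes)

years = 2020 - 2009
growth_factor = data_2020_eb / data_2009_eb        # roughly 44x overall
cagr = growth_factor ** (1 / years) - 1            # implied compound annual growth

print(f"Overall growth: {growth_factor:.0f}x over {years} years")
print(f"Implied annual growth: {cagr:.0%}")        # roughly 41% per year
print(f"2020 volume: {data_2020_eb / 1000:.0f} ZB, or {data_2020_eb * 10**9:,.0f} GB")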

CSC sees M2M (machine-to-machine) processes, driven by the IoT, generating 70% of that data, but also notes that enterprise IT will need to handle 80% of it. That's 28 zettabytes. As a steady stream, the world will need to manage dataflows approaching 1 petabyte per second, 24 hours a day, 7 days a week, 365 days a year.
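
Translating that into a sustained dataflow is straightforward arithmetic. The sketch below (again in Python, decimal units assumed) shows how 28 zettabytes a year works out to just under a petabyte per second.

# Convert 28 ZB per year into a sustained per-second dataflow.
# The zettabyte figure comes from the article; units are decimal (1 ZB = 10^21 bytes).

enterprise_zb = 28                               # the share enterprise IT must handle

seconds_per_year = 365 * 24 * 3600
bytes_per_second = enterprise_zb * 10**21 / seconds_per_year
petabytes_per_second = bytes_per_second / 10**15

print(f"{petabytes_per_second:.2f} PB/s sustained")   # about 0.89 PB/s, approaching 1 PB/s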

If you were capacity planning for that volume and factored in a 10X allowance for data spikes on any individual connection, you would need to deploy roughly 8 billion 10-megabit connections. That figure does not account for multiple uses of any file; one can assume that YouTube or Netflix, for example, will serve most of their files far more than once.
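
To see where a number like 8 billion comes from, here is one way to run the sizing math, taking the roughly 1 PB/s sustained flow from above, the 10X spike allowance, and 10-megabit links as given. This is an illustration of the logic, not a capacity plan.

# Rough sizing of the "8 billion 10-megabit connections" estimate.
# Sustained flow, spike allowance, and link size are the article's figures.

sustained_pb_per_s = 0.9          # ~1 PB/s sustained, from the estimate above
spike_allowance = 10              # 10x headroom for bursts on any single connection
link_mbit_per_s = 10              # capacity of each connection, in megabits per second

required_bits_per_s = sustained_pb_per_s * 10**15 * 8 * spike_allowance
connections = required_bits_per_s / (link_mbit_per_s * 10**6)

print(f"{connections / 10**9:.1f} billion connections")   # on the order of 7-8 billion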

So, the challenge is enormous. Big Data is here, and here to stay. It will be driven by the IoT, and will not be possible without widely dispersed cloud computing.

Attendees at @CloudExpo in November get this, of course, and will also benefit greatly from networking with their peers, teaching, listening, and learning.

In the big picture, how will all this affect the global economy? By how many trillions will that current $3.7 trillion figure rise?

More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's CloudComputing BigData & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.
