
Intercloud: The Evolution of Global Application Delivery

The concept of an “intercloud” is floating around the tubes and starting to gather some attention. According to Greg Ness you can “Think of the intercloud as an elastic mesh of on demand processing power deployed across multiple data centers. The payoff is massive scale, efficiency and flexibility.”

Basically, the intercloud is the natural evolution of global application delivery. The intercloud is about delivering applications (services) from one of many locations based on a variety of parameters that will be, one assumes, user/organization defined. Some of those parameters could be traditional ones: application availability, performance, or user-location. Others could be more business-focused and based on such tangibles as cost of processing.

Greg, playing off Hoff, explains:

For example, services could be delivered from one location over another because of short term differentials in power and/or labor costs. It would also give enterprises more viable options for dealing with localized tax or regulatory changes.

The intercloud doesn’t yet exist, however; it has at least one missing piece: the automation of manual tasks at the core of the network. The intercloud requires automation of network services, the arcane collection of manual processes required today to keep networks and applications available.

Until there is network service automation, all intercloud bets are off.

What I find eminently exciting about the intercloud concept is that it requires a level of intelligence, of contextual awareness, that is the purview of application delivery. We’re calling them services again, like we did when SOA was all the rage, but in the end even a service can be considered an application – it’s a self-contained piece of code that executes a specific function for a specific business purpose. If it makes it easier to grab onto, just call “application delivery” “service delivery,” because there really isn’t too much of a difference there. But intercloud requires a bit more awareness than global application delivery; specifically, it requires more business and data center specific awareness than we have available.

On the surface intercloud sounds a lot like what we do today in a globally load balanced environment: application services are delivered from the data center that makes the most sense based on variables (context) surrounding the request including the user, the state of the data center, the networks involved, and the applications themselves. Global application delivery decisions are often made based on availability or location, but when the global application delivery infrastructure is able to collaborate with the local application delivery infrastructure the decision making process is able to get a lot more granular. Application performance, network conditions, capacity – all can be considered as part of the decision regarding which data center should service any given request.
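The context-driven decision described above can be sketched in code. This is a minimal illustration, not any vendor's actual API: the `DataCenter` fields, the scoring weights, and the 90% capacity cutoff are all assumptions chosen to show how availability, location, performance, and capacity might combine into a single routing decision.

```python
from dataclasses import dataclass

@dataclass
class DataCenter:
    name: str
    available: bool        # is the site up and serving?
    avg_latency_ms: float  # measured application response time
    capacity_used: float   # fraction of capacity in use, 0.0-1.0
    region: str            # coarse location, e.g. "us-east"

def choose_datacenter(dcs, user_region):
    """Pick the data center that best fits the request context:
    availability first, then proximity, performance, and headroom."""
    candidates = [dc for dc in dcs if dc.available and dc.capacity_used < 0.9]
    if not candidates:
        raise RuntimeError("no data center can serve this request")
    def score(dc):
        proximity = 0 if dc.region == user_region else 50   # latency penalty
        return dc.avg_latency_ms + proximity + dc.capacity_used * 100
    return min(candidates, key=score)

dcs = [
    DataCenter("chicago", True, 40.0, 0.70, "us-east"),
    DataCenter("oregon",  True, 25.0, 0.95, "us-west"),  # near capacity, skipped
    DataCenter("dublin",  True, 30.0, 0.40, "eu-west"),
]
print(choose_datacenter(dcs, "us-east").name)  # chicago wins for a US-East user
```

Note that the same request from an EU user would land in Dublin – the decision is a function of the request's context, not a static mapping.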

I rarely disagree with Greg and, on the surface at least, he is absolutely right in that we need to automate processes before the intercloud can come to fruition. But we are also missing one other piece: the variables that are peculiar to the business and data centers comprising the intercloud, and the integration/automation that will allow global application delivery infrastructure to take advantage of those variables in an efficient manner. That data is likely assumed in the need to automate; without it, there simply isn’t enough information to automate decisions across data centers in the way Greg and Hoff expect such systems to do.


WHAT’S DIFFERENT ABOUT INTERCLOUD?
What makes the intercloud differ from today’s global application delivery architectures is the ability to base the data-center decision on businessy (non-IT) data. This data is necessary to construct the appropriate rules against which request decision making processes can be evaluated. While global application delivery systems today are capable of understanding a great many variables, there are a few more nascent data points they don’t have, such as the cost to serve up an application (service), labor costs, or a combination of time of day and any other variable.
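To see what folding that business data into the decision might look like, here is a hypothetical sketch. The cost inputs, the dollars-to-milliseconds scaling factor, and the `pick_cheapest_fit` function are all invented for illustration; the point is only that a per-site "cost to serve" becomes one more term in the same score that already weighs performance and load.

```python
def pick_cheapest_fit(dcs, costs):
    """Choose among healthy data centers by blending operational fitness
    with the current business cost of serving a request from each site."""
    def score(dc):
        operational = dc["latency_ms"] + dc["load"] * 100
        # Scale per-request cost (dollars) into the same range as the
        # operational terms; the weight is an arbitrary illustrative choice.
        return operational + costs[dc["name"]] * 1000
    return min((d for d in dcs if d["up"]), key=score)

dcs = [
    {"name": "chicago", "up": True, "latency_ms": 40, "load": 0.5},
    {"name": "dublin",  "up": True, "latency_ms": 45, "load": 0.5},
]
# Off-peak power in Dublin makes it cheaper per request right now.
costs = {"chicago": 0.08, "dublin": 0.02}
print(pick_cheapest_fit(dcs, costs)["name"])  # dublin, despite higher latency
```

Operationally the two sites are nearly identical here; the business variable is what tips the decision.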

Don’t get me wrong – an intelligent global application delivery system can be configured with such information today, but it’s a manual process, and manual processes don’t scale well. This is why Greg insists (correctly) that automation is the key to the intercloud. If the cost of power, for example, changes throughout the day (and may in fact be volatile in general), then the global application delivery system would need to be manually reconfigured each time it changed. That simply wouldn’t be feasible. A system for providing that information – and any other information which would become the basis for request routing across distributed data centers – needs to be constructed and subsequently able to be integrated into the massive management system that will drive the intercloud.
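The automation argument can be made concrete with a small sketch: a volatile business feed (here a stand-in for spot power prices) drives routing weights with no human in the loop. Everything here is hypothetical – the feed, the inverse-price weighting, and the `RoutingPolicy` class are assumptions meant only to show the shape of the integration, not an existing product feature.

```python
def fetch_power_prices():
    """Stand-in for a real pricing feed; returns $/kWh per data center.
    In practice this would poll an external source that changes hourly."""
    return {"chicago": 0.11, "dublin": 0.04}

class RoutingPolicy:
    def __init__(self):
        self.weights = {}

    def refresh(self, prices):
        # Cheaper power -> higher routing weight; weights always sum to 1.
        # Recomputed automatically, replacing the manual reconfiguration step.
        total = sum(1 / p for p in prices.values())
        self.weights = {dc: (1 / p) / total for dc, p in prices.items()}

policy = RoutingPolicy()
policy.refresh(fetch_power_prices())
print(policy.weights)  # dublin gets the larger share while its power is cheap

# In production this refresh would run on a schedule rather than once, e.g.
# re-evaluating the feed every few minutes inside a loop or timer.
```

The interesting part is what isn't in the code: no operator edits a configuration when the price changes; the next refresh simply produces new weights.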

It makes a certain amount of sense, if you think about it, that global application delivery would also need to evolve into something more: something capable of context awareness at a higher point of view than local application delivery. Global application delivery will be the foundation for intercloud because it’s already performing the basic function – we just lack the variables and the automation necessary for global application delivery solutions to take the next step and become intercloud controllers.

But they will get there.



More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
