Data Center Architecture: Together and Apart

The datacenter represents a diverse set of orchestrated resources bound together by the applications they serve

The challenge in architecting, building, and managing data centers is one of balance. There are competing forces that both push datacenter resources together and pull them apart. Finding an equilibrium point that is technologically sustainable, operationally viable, and business friendly is challenging. The result is frequently a set of compromises whose costs outweigh their advantages.

Logically together

The datacenter represents a diverse set of orchestrated resources bound together by the applications they serve. At its simplest, these resources are physically co-located. At its extreme, they are geographically distributed across many sites. Whatever the physical layout, these resources are under pressure to be treated as a single logical group.

Resource collaboration - The datacenter is a collection of compute and storage resources that must work in concert in support of application workloads. The simple requirement of coordination creates an inward force pulling resources closer together, even if only logically. How can multiple elements work together towards a common goal if they are completely separate?

The answer is that they cannot. And as IT moves increasingly towards distributed applications, the interdependence between resources only grows.

Interestingly, the performance advantages of distributed architectures are only meaningful when communication between servers is uninhibited. If the network that makes communication possible slows down, the efficacy of the distributed architecture decreases. This means that datacenter architects must solve simultaneously for compute and storage demand, and the interconnect capacity required between them.
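
The effect is easy to see with a back-of-envelope model. The sketch below (Python; the job size, shuffle volume, and link speeds are invented for illustration) splits a distributed job's completion time into per-node compute and the time spent moving data between nodes: as the fabric slows, the benefit of adding servers evaporates.

```python
# Back-of-envelope model: adding servers to a distributed job helps only while
# the interconnect keeps up with the data those servers exchange.
# All figures below are illustrative assumptions, not measurements.

def job_time(nodes, total_compute_s=3600.0, shuffle_gb=50.0, link_gbps=10.0):
    """Rough completion time (seconds) for a distributed job.

    total_compute_s -- compute time for the whole job on a single node
    shuffle_gb      -- data each node must exchange with its peers
    link_gbps       -- usable per-node network bandwidth
    """
    compute_s = total_compute_s / nodes                      # ideal parallel speedup
    network_s = 0.0 if nodes == 1 else (shuffle_gb * 8) / link_gbps
    return compute_s + network_s

for n in (1, 4, 16, 64):
    fast = job_time(n, link_gbps=10.0)   # healthy fabric
    slow = job_time(n, link_gbps=1.0)    # congested fabric
    print(f"{n:3d} nodes: {fast:7.1f}s on a 10G fabric, {slow:7.1f}s on a 1G fabric")
```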

Resource availability - Building out a datacenter is an exercise in matching resource capacity to demand. But not just in aggregate.

Individual applications, tenants, and geographies all place localized demands on datacenter resources. If aggregate capacity is sufficient to meet demand but the resources exist in separate pools, you end up in a perpetual state of mismatch. There is always either too much or too little workload capacity in any given pool. The former means you have overbuilt. The latter leaves you wanting more, which, oddly enough, means you end up having to overbuild anyway.
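
The mismatch is easy to demonstrate with a toy placement exercise. In the sketch below (Python; the pool sizes and workload demands are invented), aggregate capacity comfortably exceeds aggregate demand, yet a workload that must land entirely within one pool still cannot be placed.

```python
# Toy illustration of resource islands: aggregate capacity exceeds aggregate
# demand, yet placement fails because each workload must fit entirely inside
# a single pool. Pool sizes and demands are invented for the example.

pools = {"site-a": 40, "site-b": 40, "site-c": 40}   # free cores per pool
workloads = [30, 30, 30, 25]                         # cores each workload needs

print(f"aggregate capacity: {sum(pools.values())}  aggregate demand: {sum(workloads)}")

unplaced = []
for demand in workloads:
    # first fit: a workload can only draw capacity from one pool
    target = next((p for p, free in pools.items() if free >= demand), None)
    if target is None:
        unplaced.append(demand)
    else:
        pools[target] -= demand

print("remaining free capacity per pool:", pools)
print("unplaced workloads:", unplaced or "none")
# Capacity (120) exceeds demand (115), yet the 25-core workload has nowhere
# to go: the spare capacity is stranded as 10-core fragments in three islands.
```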

Combating these resource islands requires pulling resources closer together. In the simplest case, this is a physical act. But even if resources cannot be physically co-located, there are entire classes of technologies whose primary function is to allow physically separate resources to behave as if they are in close proximity.

Of course this does not come without a cost. The complexity of managing the disparate technologies required to logically pool physically separate resources can be prohibitively difficult. Even the most skilled specialists have to invest time in creating a properly engineered fabric between sites that accounts for queuing, prioritization, load balancing, and so on. The number of protocols and technologies required is high, and the volume of devices over which they must be applied can be huge. The result is a level of complexity that makes the network more expensive to manage and more difficult to change.

Organizational process - Friction is greatest at boundaries. Whenever a task requires involvement across different organizations or teams, the act of human coordination imposes a tax on both effort and time. In larger organizations, the handoff between teams might be automated to reduce communication mistakes (as with a ticketing system), but the shift in context is still expensive.

This creates organizational pressure to pull together things that might otherwise be separate. If distributed resources can be logically centralized and managed within a common organization, it reduces the dependence on outside teams. Removing boundaries from common workflows lowers organizational friction and makes the overall task of managing the infrastructure easier.

Physically separate

At the same time that forces are pulling things together, equally strong opposing forces are exerting outward pressure on datacenter resources.

Business continuity - For many companies, the datacenter represents a mission critical element of their infrastructure. For companies whose existence depends on the presence of the resources within the datacenter (be they data, servers, or applications), it is untenably risky to rely on a single physical site. This exerts an outward force on resources as companies must create multiple physical sites, typically separated by enough distance that a disaster would not meaningfully impact all sites.

Despite the operational desire to keep things together, the risk to the business dictates that resources be physically separate.

Natural expansion - As resources are added to a datacenter, they are typically installed in racks in relatively close proximity to each other. When racks are empty, there is no reason to create unnecessary physical separation between resources working in concert. Over time, adjacent rack space is filled through the natural expansion of compute, storage, and networking capacity.

As equipment expands, available rack space is depleted, and new racks and rows are populated. Eventually, the device sprawl can occupy entire data centers.

Imagine now that a cluster of servers occupies a rack in one corner of the datacenter. If that cluster is to be expanded, where does the next server go? If the nearby racks are already built out, that resource must be installed some physical distance away from the resources with which it must coordinate.
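
One way to make the question concrete is a small, hypothetical placement sketch (Python; the rack layout, free space, and server size are all made up): once the racks adjacent to the cluster are full, the nearest fit is most of a row away.

```python
# Hypothetical rack layout: each rack has a position along the row and some
# free rack units (RU). The cluster began in rack 0; the adjacent racks have
# since filled up, so the next 2RU server lands far from its peers.

racks = [
    {"position": 0, "free_ru": 0},    # original cluster rack, now full
    {"position": 1, "free_ru": 0},    # adjacent racks filled by other gear
    {"position": 2, "free_ru": 0},
    {"position": 3, "free_ru": 1},    # not enough room for a 2RU server
    {"position": 9, "free_ru": 12},   # next fit is most of a row away
]

def place(server_ru, cluster_position=0):
    """Return the nearest rack (by row position) with room for the server."""
    candidates = [r for r in racks if r["free_ru"] >= server_ru]
    return min(candidates, key=lambda r: abs(r["position"] - cluster_position),
               default=None)

rack = place(server_ru=2)
print("next cluster member lands at rack position:", rack["position"])
# -> 9: the new server now coordinates with peers nine positions away, and its
#    traffic crosses that much more of the datacenter fabric.
```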

It is nearly impossible to plan for all future growth at the time of a datacenter's inception. Leaving enough space in adjacent racks to account for a decade of growth is impractically expensive. A sparsely populated datacenter suffers from poor space utilization, challenging power distribution, and difficult cabling. Thus, the mere act of expansion exerts an outward force leading to physically distributed resources.

Real estate - Sometimes, even when architects want to keep resources together, physical limitations create problems. There is no more immovable object than real estate (which serves as a proxy for all of space, power, and HVAC). In some cases, it is impossible to build out either laterally or even up. In other cases, there is no additional power to be had from the grid. Either of these scenarios forces an expansion to another site, which requires the physical separation of resources that might be expected to function in concert.

Additionally, as land rates change and technologies evolve, the best spots for data centers are not always known in advance. It is difficult at best to predict with enough certainty how a physical site will evolve over an arbitrarily long time horizon. For example, not long ago, the thought of building cooling-hungry data centers in the hot desert was foreign. Today, Las Vegas is home to some of the most cutting-edge facilities in the world. This means that geographical dispersion is all but a certainty for large companies. The forces pulling resources physically apart are unlikely to be neutralized.

Finding a balance

Given the strong forces working to keep resources logically together and the equally strong forces keeping them physically separate, how does anyone find a balance?

The price of balance is cost and complexity. You pay for reach directly, and control requires complexity; both translate into higher carrying costs for the infrastructure. The push-pull dynamic in datacenters is not going away anytime soon. In fact, the move towards more distributed applications will only make the existing balancing act harder.

Newer technology offerings like SDN and datacenter fabrics offer some hope, but only insofar as they offer alternative approaches to the existing problems. Whatever the solution, architects will need to evaluate approaches based not just on their features but on the long-term costs of those features.

[Today’s fun fact: “Way” is the most frequently used noun in the English language. No way!]

The post Datacenter architecture: Together and apart appeared first on Plexxi.

More Stories By Michael Bushong

The best marketing efforts leverage deep technology understanding with a highly approachable means of communicating. Plexxi's Vice President of Marketing, Michael Bushong, acquired these skills over 12 years at Juniper Networks, where he led product management, product strategy, and product marketing organizations for Juniper's flagship operating system, Junos. Michael spent his last several years at Juniper leading its SDN efforts across both service provider and enterprise markets. Prior to Juniper, Michael spent time at database supplier Sybase and ASIC design tool companies Synopsys and Magma Design Automation. Michael's undergraduate work at the University of California, Berkeley in advanced fluid mechanics and heat transfer lends new meaning to the marketing phrase "This isn't rocket science."
