
451 Research Publishes In-Depth Server and Virtualization Study

Server virtualization projects continue driving activity and spending across the IT marketplace

NEW YORK, Jan. 15, 2013 /PRNewswire/ -- TheInfoPro, a service of 451 Research, released its latest Servers and Virtualization Study, which points to a major refresh of x86 server infrastructure and of the associated network, storage and software technologies required to optimize performance in virtualized, cloud-ready datacenters. Conducted during the second half of 2012, the study identifies the key initiatives of senior server infrastructure managers and examines market factors and major players. This annual study is based on extensive live interviews with server professionals and primary decision-makers at large and midsize enterprises in North America and Europe.

Highlights from TheInfoPro's Servers and Virtualization Study include:

  • Server virtualization projects are still driving activity and spending across much of the IT marketplace, with fewer than a third of respondents considering their environments sufficiently virtualized.

  • The majority of respondents are undertaking a major refresh of their x86 server infrastructures together with the network and storage technologies that are required to optimize performance in virtualized, cloud-ready datacenters.

  • In the x86 environment, which represents more than 80% of respondents' computing capacity, average virtualization levels have increased 13% from last year to 51%, with a notable increase at the higher virtualization levels: the number of organizations virtualizing production applications has roughly doubled.

  • The complexity and interdependency of storage, network, server and software in virtualized environments is driving interest in 'integrated infrastructure' solutions, which include unified computing and converged and appliance-oriented infrastructure. In these categories, general-purpose offerings – especially those that are composed of multivendor components – are gaining favor, with offerings from Cisco and its array of partners being the most widely mentioned by respondents.

  • From the software perspective, attention is switching from base virtualization capabilities to the automation tools required to manage production workloads in virtualized environments: service catalogs, usage-based reporting and accounting (show-back), service-level monitoring tools and runbook or script-based automation and provisioning.

  • With most organizations embroiled in virtualizing business-critical production workloads, it is hardly surprising that vendors closely associated with the technologies required to build cloud-ready, virtualized datacenters top the list of exciting vendors. This strongly favors VMware as the dominant virtualization provider for x86-based infrastructure, and Cisco for hardware vendors. Both vendors also top TheInfoPro customer ratings for promise and fulfillment.

"Server virtualization projects are still dominating IT activity, creating a one-time spending bubble as organizations lay down the foundation for a cloud-ready infrastructure," said Peter ffoulkes, TheInfoPro's Research Director for Servers and Virtualization. "Complexity is driving interest in converged infrastructure solutions, with 13% of respondents planning to implement the technology for the first time within the next two years."

Research Director and report author Peter ffoulkes will host a 451 Research Innovation webinar on January 31 to discuss the report's findings.

Webinar Details:

About TheInfoPro Servers and Virtualization Study
TheInfoPro's Servers and Virtualization Study takes an in-depth look at key industry trends and tracks the performance of individual vendors. Now in its ninth year, the study was finalized in December 2012. TheInfoPro's methodology uses extensive interviews with a proprietary network of IT professionals and key decision-makers at large and midsize enterprises. Each interview explores several fundamental areas, including implementation and spending plans for more than 30 technologies, evaluations of vendors from business and product perspectives, macro IT influences transforming the sector, and factors affecting decision-making processes. Results are collated into comprehensive research reports providing business intelligence in the form of technology roadmaps, budget trends, and vendor spending plans and performance ratings. A sampling of vendors covered in the Vendor Performance and Technology Roadmap components of the study includes BMC Software, CA, Cisco, Citrix, Dell, EMC, HP, IBM, Microsoft, Novell, Oracle, Red Hat, ServiceNow, SolarWinds, VCE and VMware.

About 451 Research
451 Research, a division of The 451 Group, is focused on the business of enterprise IT innovation. The company's analysts provide critical and timely insight into the competitive dynamics of innovation in emerging technology segments. Business value is delivered via daily concise and insightful published research, periodic deeper-dive reports, data tools, market-sizing research, analyst advisory, and conferences and events. Clients of the company – at vendor, investor, service-provider and end-user organizations – rely on 451 Research's insight to support both strategic and tactical decision-making. 451 Research is headquartered in New York, with offices in key locations, including San Francisco, Washington DC, London, Boston, Seattle and Denver.

Media Contacts:
Newsmaker Group for 451 Research:
Jennifer Fugel
845.657.4202
[email protected] 
 OR
Lynn Schwartz
973.736.7118
[email protected]

This press release was issued through eReleases® Press Release Distribution. For more information, visit http://www.ereleases.com


SOURCE 451 Research


Copyright © 2007 PR Newswire. All rights reserved. Republication or redistribution of PRNewswire content is expressly prohibited without the prior written consent of PRNewswire. PRNewswire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
