By Stefan Bernbo
April 4, 2014 12:30 PM EDT
For an example of just how dramatically storage has changed over the past fifteen years, consider your music collection. At one point, you had a collection of cassettes that stored songs on tape. As the years went on, your hairstyle changed and you bought a CD player that used a spinning disk to store more song data at higher quality than tape could. Spinning disks flourished well into the MP3 player era, surviving even the initial introduction of flash storage thanks to their lower cost. Eventually, however, your newest smartphone or iPod shipped with flash storage instead, as manufacturers bowed to its superior performance over disk storage and its increasingly competitive price.
This is an example of a sea change taking place at a much bigger scale as well. Instead of gigabytes, think petabytes.
The data center infrastructures designed by telcos, service providers and major enterprises to store massive quantities of data have until recently relied predominantly on disk storage, sometimes blending in flash for performance-intensive tasks. While the speed and performance of flash have long tempted data center architects to deploy it more widely throughout the data center, only recently has its price dropped enough to make broader use a viable option.
To understand why flash storage has suddenly become a practical choice for data center architects across industries, it is helpful to examine the differences between flash and disk storage.
The Next Big Thing, Again
As the example above shows, disk storage at its introduction represented leaps and bounds of progress in speed and efficiency compared to tape storage, the predominant method of the time. Even after flash entered the market, disk storage remained the server architecture of choice. Flash did deliver substantially higher performance, but was priced too high to present a real threat to the prevalence of spinning disks. In addition, flash drives offered smaller capacities and could not store as much data per dollar as spinning disks.
However, recent improvements in flash have slashed its price significantly, positioning it as a true data center hardware alternative whose benefits - higher throughput and lower latency - have grown dramatically at the same time. As an added plus, flash is highly energy efficient, drawing only a fraction of the power that disk storage requires - sometimes as little as one-sixteenth. Flash drives still fail at a faster rate than disks do, but the performance gains and price drops of recent years have made flash a realistic and highly attractive option for data center architecture and design.
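That one-to-16 power ratio compounds quickly at data center scale. The sketch below runs the arithmetic for a fleet of drives; the wattage figures and electricity price are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope comparison of annual drive power costs.
# Per-drive wattages and the electricity price are assumptions
# for illustration, not measured figures.

HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10  # assumed electricity price in USD

def annual_power_cost(watts_per_drive, drive_count):
    """Annual electricity cost for drives running 24/7."""
    kwh = watts_per_drive * drive_count * HOURS_PER_YEAR / 1000.0
    return kwh * PRICE_PER_KWH

# 1,000 drives at an assumed 8 W per HDD vs 0.5 W per SSD (a 16:1 ratio):
disk_cost = annual_power_cost(watts_per_drive=8.0, drive_count=1000)
flash_cost = annual_power_cost(watts_per_drive=0.5, drive_count=1000)

print(f"HDD:   ${disk_cost:,.0f}/year")   # $7,008/year
print(f"SSD:   ${flash_cost:,.0f}/year")  # $438/year
print(f"ratio: {disk_cost / flash_cost:.0f}x")
```

Note this counts only the drives themselves; the cooling savings from the lower heat output would widen the gap further.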
Making the Switch
In fact, it's increasingly feasible that today's data center - still reliant on disk storage - could run entirely on flash tomorrow. Telcos, service providers, enterprises and other companies whose profits are tied to the speed and availability they can deliver to their customers are beginning to see flash storage's blistering performance less as a "nice to have" option and more as a core technology for maintaining a competitive edge.
While the high-performance-demanding industries of telcos and service providers are diving into flash straight away, vendors in other vertical markets have made cost-benefit calculations and elected to hold back until the price of flash storage drops even further. For example, a Dropbox-style file hosting service for consumer cloud storage isn't as likely to be motivated by raw performance as by ensuring the availability of cheap storage at scale. Companies like these are making the usual storage tradeoff: finding a comfortable balance between price and capacity. However, when the price of flash finally falls to that of disk, the last barrier will drop for companies that want to remain competitive. When that milestone arrives, the market shift will be as significant as when disks replaced tape by beating it on the same measures: higher performance and better pricing.
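The cost-benefit calculation those vendors are running reduces to price per gigabyte. A minimal sketch of that arithmetic, using hypothetical 2014-era list prices (the drive prices and capacities below are assumptions for illustration):

```python
# Illustrative price/capacity tradeoff. Drive prices and capacities
# are hypothetical examples, not quoted market figures.

def cost_per_gb(drive_price_usd, capacity_gb):
    """Simple $/GB metric used to compare storage media."""
    return drive_price_usd / capacity_gb

hdd = cost_per_gb(drive_price_usd=160.0, capacity_gb=4000)  # assumed 4 TB disk
ssd = cost_per_gb(drive_price_usd=480.0, capacity_gb=960)   # assumed ~1 TB flash

print(f"HDD: ${hdd:.3f}/GB, SSD: ${ssd:.3f}/GB")
print(f"SSD premium: {ssd / hdd:.1f}x")
```

For a capacity-driven service, that multiple is the whole decision; it is only when the premium approaches 1x that performance becomes a free upgrade.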
Advancements in Software
One of the trends making this shift possible is that of software-defined storage. By adopting a software-defined approach to storage infrastructure, organizations have the flexibility to deploy flash storage throughout their data center architectures quickly and easily.
As background, software-defined storage seeks to move functions and features from the hardware layer to the software layer. This approach removes the dependence on expensive hardware redundancies that exist only to mask hardware-level failures. Data center architects must always plan for the inevitable failure of hardware, and flash in particular currently has a shorter time to failure than disk. In storage environments without RAID cards, the failure of a drive produces an error that reaches the end user. To prevent this, architects build in expensive, redundant RAID hardware to hide the errors. With the right software-defined strategy, these failures can instead be absorbed in software and made invisible to the end user. And since software-defined storage is hardware-agnostic, it can run on any hardware configuration.
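The idea of absorbing drive failures in software rather than in RAID hardware can be sketched as simple replication: every object is written to several nodes, and a read succeeds as long as any replica survives. This is a minimal illustration of the principle, not any particular vendor's implementation; the node names and replica count are assumptions.

```python
# Minimal sketch of software-level redundancy replacing RAID hardware:
# each object is written to several nodes, and a read succeeds as long
# as any replica survives. Names and counts are illustrative.

class StorageNode:
    def __init__(self, name):
        self.name = name
        self.alive = True
        self.objects = {}

    def put(self, key, data):
        if self.alive:
            self.objects[key] = data

    def get(self, key):
        if self.alive:
            return self.objects.get(key)
        return None  # a failed node simply returns nothing

class ReplicatedStore:
    """Writes every object to `replicas` consecutive nodes."""
    def __init__(self, nodes, replicas=3):
        self.nodes = nodes
        self.replicas = replicas

    def _targets(self, key):
        start = hash(key) % len(self.nodes)
        return [(start + i) % len(self.nodes) for i in range(self.replicas)]

    def put(self, key, data):
        for idx in self._targets(key):
            self.nodes[idx].put(key, data)

    def get(self, key):
        for idx in self._targets(key):
            data = self.nodes[idx].get(key)
            if data is not None:
                return data  # any surviving replica serves the read
        raise KeyError(key)

nodes = [StorageNode(f"node{i}") for i in range(5)]
store = ReplicatedStore(nodes, replicas=3)
store.put("photo.jpg", b"...bytes...")
nodes[hash("photo.jpg") % 5].alive = False  # primary replica fails
print(store.get("photo.jpg"))  # still served from a surviving replica
```

The end user sees no error when a node dies; the software routes around the failure, which is exactly the role RAID hardware plays today, minus the dedicated cards.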
There are a number of additional benefits that telcos and service provider data center architects can achieve by combining software-defined storage with flash hardware. For instance, the organization could still utilize a single name space spanning all its storage nodes if it were to use a software-defined storage approach. In addition, it could also run applications in the storage nodes as well, creating new "compustorage" nodes instead. As a result, the storage hardware wouldn't need to be big or costly, but could still have very high performance and speed. Organizations can start with a small number of cheap servers instead of building a large, expensive and traditional installation, and still scale linearly as needed.
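The single name space spanning all storage nodes can be sketched in a few lines: a client hashes the object name to pick a node, so any client on any node resolves the same name to the same place without a central directory. The node count and object names here are illustrative assumptions.

```python
# Sketch of one flat namespace spanning many storage nodes: a stable
# hash of the object name selects the owning node, so no central
# directory is needed. Node count and names are illustrative.

import hashlib

def node_for(name, node_count):
    """Map an object name to a node index via a stable hash."""
    digest = hashlib.sha256(name.encode()).digest()
    return int.from_bytes(digest[:8], "big") % node_count

# Any client, anywhere in the cluster, resolves the same name
# to the same node:
print(node_for("videos/cat.mp4", node_count=4))
print(node_for("videos/cat.mp4", node_count=4))  # same index both times
```

A production system would use consistent hashing so that adding nodes remaps only a fraction of the namespace; this fixed-modulo version just illustrates how a shared namespace lets a cluster of small, cheap servers scale linearly.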
Benefits of a software-defined approach to an all-flash data center are:
- Huge performance improvements, through the ability to use faster flash technology throughout the data center.
- Lower running costs, since SSDs draw less power and generate far less heat than spinning disks, reducing the energy needed for cooling.
- A smaller data center footprint, since SSDs are physically much smaller than spinning disks and require less real estate to house them.
- More applications running on the same hardware, thanks to the hardware performance gains.
Even as many of us still listen to CDs in the car, the music industry is inevitably shifting to a new paradigm built on music files saved on flash storage. The trend is repeating across industries, but nowhere as dramatically as it is in the data center. Flash storage - with its extreme performance, efficient energy usage and increasingly competitive cost - will eventually become the industry status quo. By using software-defined storage, data center architects can design a flexible, efficient and powerful framework for telcos, service providers and major enterprises looking to get the most powerful and energy-efficient data center possible by using all flash.