
Useful Cloud Advice, Part One: Storage

Cloud Storage is here, and seems to be pretty ready for prime-time if you stick with the major providers

There’s a whole lot of talk about cloud revolutionizing IT, a whole lot of argument about public versus private cloud, and even a considerable amount of talk about what belongs in the cloud. But there’s not much talk about helping you determine which applications and storage are good candidates to move there, considering all of the angles that matter to IT. This blog will focus on storage, and the next one on applications, because I don’t want to bury you in a single blog as long as a feature-length article.

It amazes me when I see comments like “no one needs a datacenter” while the definition of what, exactly, cloud is still carries the weight of debate. For the purposes of this blog, we will limit the definition of cloud to Infrastructure as a Service (IaaS): VM containers and the things to support them, or storage containers and the things to support them. My reasoning is simple: the only other big category of “cloud” at the moment is SaaS, and since SaaS has been around for about a decade, you should already have a decision-making process for outsourcing a given application to such a service. Salesforce.com and Google Docs are examples of what is filtered out by saying “SaaS is a different beast.” Hosting services and CDNs are a chunk of the market, but increasingly they are starting to look like IaaS as they add functionality to meet the demands of their IT customers. So we’ll focus on the “new and shiny” aspects of cloud that have picked up a level of mainstream support.


STORAGE

Cloud Storage is here, and seems to be pretty ready for prime-time if you stick with the major providers. That’s good news, since instability in such a young market could spell its doom. The biggest problem with cloud storage is that nearly every large provider implements it as SOAP calls. An architectural decision had to be made, but with CIFS, NFS, iSCSI, and FCoE available, choosing a web service interface wasn’t necessary, so I’m unsure why nearly all vendors did. Conveniently, almost as soon as it became clear that the enterprise needed cloud storage to look like all the other storage in its datacenters, cloud storage gateways hit the market. In most cases these were brand-new products from brand-new companies, but some, like our ARX Cloud Extender, are additions to existing storage products that help leverage both internal and external assets. Utilizing a cloud storage gateway allows you to treat cloud storage like NAS or SAN disk (depending upon the gateway and cloud provider), which really opens it up to whatever use you need to make of the storage within your enterprise.
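To make the contrast concrete, here is a minimal sketch of the two access models. The endpoint, bucket, and mount-point names are hypothetical, and the raw call is a generic web-service-style PUT rather than any particular provider’s API:

```python
import shutil
import urllib.request

# Without a gateway: the storage is only reachable through a web service API.
# Endpoint, bucket, and key names below are hypothetical placeholders.
def put_object_via_api(endpoint, bucket, key, path):
    """Upload a file with a raw HTTP PUT -- the kind of web service call
    (SOAP or REST, depending on the provider) that file-based enterprise
    applications do not know how to make. Real providers also require
    authentication headers."""
    with open(path, "rb") as f:
        req = urllib.request.Request(
            f"{endpoint}/{bucket}/{key}", data=f.read(), method="PUT")
        urllib.request.urlopen(req)

# With a cloud storage gateway: the same storage is mounted like any other
# NAS share, so existing tools and applications just work.
def put_file_via_gateway(path, mounted_share):
    shutil.copy(path, mounted_share)  # e.g. a CIFS/NFS mount the gateway exposes

# put_object_via_api("https://storage.example.com", "backups", "db.bak", "db.bak")
# put_file_via_gateway("db.bak", "/mnt/cloud-gateway/archive/")
```

The point of the gateway is the second function: nothing in your existing architecture has to change to use it.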

Issues.

Generally speaking, WAN communications are slower than LAN communications. You can concoct situations where the LAN is slower, but most of us definitely do not live in those worlds. While compression and deduplication of data going over the WAN can help, it is best to assume that storage in the cloud will have slower access times than anything but tape sitting in your datacenter.
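A quick back-of-the-envelope calculation makes the gap concrete. The link speeds, data size, and reduction ratio below are purely illustrative assumptions, not measurements:

```python
def transfer_seconds(size_gb, link_mbps, reduction_ratio=1.0):
    """Seconds to move size_gb over a link, after data reduction.
    reduction_ratio is the combined effect of dedupe + compression
    (e.g. 3.0 means the data shrinks to one third on the wire)."""
    bits = size_gb * 8 * 1024**3 / reduction_ratio
    return bits / (link_mbps * 10**6)

size = 50  # GB -- an assumed nightly backup set
print(f"LAN (1 Gbps), raw:        {transfer_seconds(size, 1000):7.0f} s")
print(f"WAN (50 Mbps), raw:       {transfer_seconds(size, 50):7.0f} s")
print(f"WAN (50 Mbps), 3x dedupe+compress: {transfer_seconds(size, 50, 3.0):6.0f} s")
```

Even with a healthy 3x reduction ratio, the WAN path is several times slower than the LAN, which is why access-time expectations have to be set accordingly.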

Also, if you’re going to be storing data on the public Internet, it needs to be protected from prying eyes. Once it leaves your building, there are so many places and ways it can be viewed that the only real option is to encrypt it on the way out. Products like ARX Cloud Extender take care of that for you; check with your cloud storage gateway vendor, though, to be certain it does the same. If it does not, an encryption engine will have to be installed to handle encryption before data is sent out of the datacenter. If you end up having to do this, be aware that encryption can have a huge impact on compression and deduplication, because it changes the bytes themselves. Purchasing a gateway that handles all three makes the order of operations work correctly: dedupe, then compress, then encrypt.
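Here is a toy demonstration of why that order matters. The XOR keystream below is a stand-in for a real cipher such as AES, which behaves the same way for this purpose: ciphertext looks statistically random, so a compressor downstream finds nothing to squeeze:

```python
import os
import zlib

data = b"quarterly report " * 4096  # highly redundant sample data

def toy_encrypt(buf):
    """Stand-in for a real cipher (use AES in practice): XOR with a random
    keystream. Like real ciphertext, the output looks statistically random."""
    key = os.urandom(len(buf))
    return bytes(a ^ b for a, b in zip(buf, key))

# Right order: compress (after dedupe), then encrypt.
right = toy_encrypt(zlib.compress(data))
# Wrong order: encrypt first -- the compressor finds nothing to squeeze.
wrong = zlib.compress(toy_encrypt(data))

print(f"original:          {len(data):>7} bytes")
print(f"compress->encrypt: {len(right):>7} bytes")  # small
print(f"encrypt->compress: {len(wrong):>7} bytes")  # roughly original size
```

The same logic applies to deduplication: identical blocks stop being identical once each copy is encrypted separately, which is why dedupe has to run first.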

With that said, utilizing compression, TCP optimizations, and deduplication will reduce the performance hit to manageable levels, while encryption mitigates the data-in-flight security risk going to the cloud and the data-at-rest security risk while stored in the cloud. Between the two, cloud storage becomes appealing – or at least usable.

Best Fit.

Our criteria then become pretty straightforward: we want to send data to the cloud that either has no access-time constraints (e.g., it can be slower to open) or will be faster than existing technology. These criteria leave you a couple of solid options for piloting cloud storage usage.

1. Backups or Archival Storage. They’re stored off-site, they’re encrypted, and even though they’re slower than LAN-based backups, they’re still disk-to-disk (D2D), so they’re faster to back up and much faster for selective restores than tape libraries. If you already have a D2D solution in-house, this may be less appealing, but getting a copy into the cloud means that one major disaster can’t take out both your primary datacenter and its backups.

2. Infrequently Accessed or Low-Priority Storage. All the chaff that comes along with the goldmine of unstructured data in your organization is kept because you will need some of it again someday, and IT is not in a position to predict which files you will need. By setting up a cloud storage share or LUN and using tiering or some other mechanism to direct those files to it, the files are saved, but they’re not chewing up local disk. That increases available disk in the datacenter but keeps the files available in much the same manner as archival storage. Implemented in conjunction with directory virtualization, the movement of these files can be invisible to end users, as they will still appear in the same place in the global directory but will physically be moved to the cloud if they are infrequently accessed (the sketch below shows the basic policy).
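The tiering policy itself is simple. The sketch below is a crude stand-in for what a directory virtualization product does transparently; the paths and the 180-day threshold are hypothetical:

```python
import os
import shutil
import time

def tier_cold_files(source_dir, cloud_share, days_idle=180):
    """Crude age-based tiering: move files whose last access time is older
    than days_idle to a cloud-gateway-mounted share. A directory
    virtualization layer would do this transparently, leaving the file
    visible in the same place in the global directory."""
    cutoff = time.time() - days_idle * 86400
    for root, _, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            if os.stat(src).st_atime < cutoff:
                dest = os.path.join(cloud_share, os.path.relpath(src, source_dir))
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.move(src, dest)

# tier_cold_files("/srv/shares/projects", "/mnt/cloud-gateway/cold", days_idle=180)
```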

Worst Fit.

Cloud storage is no more a panacea than any other technical solution; there’s just some data that should not be moved to the cloud today, and perhaps not ever.

1. Access-Time-Sensitive Files. Don’t put your database files in the cloud (though you might check out Microsoft Azure or Oracle’s equivalent offering). You won’t like the results. Remember, just because you can doesn’t mean you should.

2. Data Critical to Business Continuity. Let’s face it: one major blow that takes out your WAN connection takes out your ability to access what’s stored in the cloud. So be careful that data needed for normal operation of the business is not solely off-site. It’s bad enough when access to the Internet is down and public websites running on datacenter servers are inaccessible, but files critical to the business – be it phone orders, customer support, whatever – must remain available if the business is to keep its doors open. Redundant WAN connections can mitigate this issue (with a price tag, of course), but even those are not proof against every eventuality that impacts only your Internet connectivity.


STORAGE SUMMARY

With cloud storage gateways and directory virtualization, there are definite “win” points for the use of cloud storage. Without directory virtualization, there are still some definite scenarios that will improve your storage architecture without breaking your back implementing them. Without a cloud storage gateway, though, most cloud storage is essentially useless to enterprise IT (other than AppDev), because none of your architecture knows how to make use of web services APIs.

But if you implement a cloud storage gateway and choose wisely, you can save more in storage upgrades than the cost of the cloud storage. This is certainly true when you would otherwise have to upgrade local storage, while cloud storage just grows a little with a commensurate monthly fee. Since fee schedules and even what is charged for (bytes in/out, bytes at rest, phase of the moon) change over time, check with your preferred vendor to make certain cloud is the best option for your scenario. And remember that deploying a directory virtualization tool will increase utilization, and tiering can help remove data from expensive tier-one disk, possibly decreasing one of your most expensive architectural needs: the fast-disk upgrade.
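If you want to sanity-check the economics yourself, the arithmetic is straightforward. Every figure below is an assumption for illustration; plug in your vendor’s actual fee schedule:

```python
def cloud_vs_upgrade(upgrade_cost, cloud_gb, fee_per_gb_month, months=36):
    """Compare a one-time local storage upgrade against cloud fees over a
    planning horizon. All figures are illustrative -- actual schedules also
    charge for bytes in/out, not just bytes at rest."""
    cloud_total = cloud_gb * fee_per_gb_month * months
    return upgrade_cost, cloud_total

upgrade, cloud = cloud_vs_upgrade(
    upgrade_cost=25000,      # assumed tier-one array expansion
    cloud_gb=2000,           # assumed cold data offloaded to the cloud
    fee_per_gb_month=0.10)   # assumed blended $/GB/month
print(f"local upgrade: ${upgrade:,.0f}   cloud over 3 years: ${cloud:,.0f}")
```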

Next time we’ll look at applications from a general IT perspective, and I’m seriously considering extending this to a third blog discussing AppDev and the cloud.


More Stories By Don MacVittie

Don MacVittie is Founder of Ingrained Technology, LLC, specializing in Development, DevOps, and Cloud Strategy. Previously, he was a Technical Marketing Manager at F5 Networks. As an industry veteran, MacVittie has extensive programming experience along with project management, IT management, and systems/network administration expertise.

Prior to joining F5, MacVittie was a Senior Technology Editor at Network Computing, where he conducted product research and evaluated storage and server systems, as well as development and outsourcing solutions. He has authored numerous articles on a variety of topics aimed at IT professionals. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
