By Michael Bushong
January 14, 2014 08:27 PM EST
I wrote previously that the networking industry was evolving from CapEx to OpEx to AppEx (Application Experience). There is certainly enough market buzz around applications if you listen to the major vendor positions. Cisco includes applications in its overarching moniker (Application-Centric Infrastructure), VMware has blogged about application awareness as it relates to NSX, and even some of the peripheral players like F5 have embraced applications as part of their ongoing dialogue with the community.
If there is a shift to AppEx, what are the implications for the CIO?
The most obvious requirement to move to Application Experience as an IT driver is to define in explicit terms what application experience means. We need to define terms like performance, scale, and even security, and then break down the various contributions by system component so that we can understand who is responsible for what impact.
But a movement towards explicitly defined application experience means a lot more than just instrumenting some infrastructure, collecting statistics, and correlating the data.
What would have to be true for application experience to be a major driving factor behind architectural decisions? At its most basic, there would have to be widespread agreement on what the meaningful applications in the company are. Certainly you cannot create blanket application experience metrics that are applied uniformly to every application. This means that CIOs who want to prepare for a move in this direction could start by cataloguing the applications in play today.
Any such inventory should explicitly document how applications contribute to meaningful corporate objectives. For high-tech companies with a distributed development force, the applications might center on code versioning, bug tracking, and compilation tools. For companies with large, people-intensive operations, the most important applications might be HCM or even ERP. For companies whose job it is to maintain data, the applications could be more related to storage and replication.
Whatever the applications, the CIO ought to know those that are most critical to the business.
Note that the focus is on what is most important. There is no real need to understand every application. Optimization is about treating some things differently. If you inventory 4,000 applications and try to make them all somewhat different, the deltas between individual applications become too small to matter. Instead, application experience will dictate that you manage by exception – identify the really critical or performance-sensitive stuff and do something special there.
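The manage-by-exception idea above can be sketched in a few lines of code. This is a hypothetical illustration, not a real tool: the scoring fields (`revenue_impact`, `user_count`, `latency_sensitive`) and the tiering thresholds are assumptions standing in for whatever criteria a given business would actually use.

```python
# Hypothetical sketch: triage an application inventory so that only the
# exceptions (the truly critical apps) get special treatment.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    revenue_impact: int      # assumed 1-5 business-impact score
    user_count: int
    latency_sensitive: bool

def tier(app: App) -> str:
    """Assign a coarse tier; everything not critical gets default handling."""
    if app.revenue_impact >= 4 or (app.latency_sensitive and app.user_count > 1000):
        return "critical"
    return "default"

inventory = [
    App("bug-tracker", 3, 500, False),
    App("order-entry", 5, 2000, True),
    App("wiki", 1, 300, False),
]
critical = [a.name for a in inventory if tier(a) == "critical"]
print(critical)  # only 'order-entry' clears the bar
```

The point of the sketch is the shape of the exercise: most of the 4,000 apps fall through to "default," and the scarce engineering attention goes to the short critical list.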
For most enterprises, IT is treated as a service organization. If this is the case, the CIO will be expected to align application experience to the various lines of business. Not only will they have an interest in what the most critical applications are but also in what metrics are being used to define success. After all, it is their employees that are the consumers of that experience. This means that CIOs should include the lines of business in the definition of application experience.
But once you define the objective, you will be expected to report progress against it. It seems likely that these metrics would eventually become performance indicators for specific groups or IT as a whole. The implication here is that the metrics will help set objectives, which means that they will influence things like bonuses and budgets.
Leaders need to understand the likely endgame. The temptation when creating metrics in many organizations is to quickly pull together metrics that are good enough. But if you know ahead of time that those metrics will eventually drive how the organization is compensated, perhaps you ought to spend more time up front getting them right. And before setting targets, you likely want to spend real time benchmarking performance in a repeatable way.
Repeatable is the key here. Anyone who has instrumented ANY system will attest to the fact that metrics are only useful if they are repeatable. If running a report yields different results every time you run it, chances are that the report is not as meaningful as you would like. The ramification is that reports need to be run around well-defined scenarios that can be reproduced on demand.
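The repeatability test above can itself be made mechanical. The sketch below is a hypothetical illustration: it runs one well-defined scenario several times and flags the metric as non-repeatable when the run-to-run spread is large relative to the median. The 10% threshold and the stand-in response times are assumptions, not recommendations.

```python
# Hypothetical sketch: a metric is only useful if re-running the same
# well-defined scenario yields stable results. Measure the spread across
# repeated runs and flag metrics that are too noisy to report on.
import statistics

def benchmark(run_scenario, runs=10):
    """Run a fixed scenario several times and summarize the results."""
    samples = [run_scenario() for _ in range(runs)]
    median = statistics.median(samples)
    spread = statistics.pstdev(samples)
    # Assumed rule of thumb: repeatable if variation stays within 10% of
    # the typical value.
    repeatable = spread <= 0.1 * median
    return median, spread, repeatable

# Stand-in scenario: pretend each run returns a response time in ms.
fake_times = iter([102, 99, 101, 100, 98, 103, 100, 99, 101, 100])
median, spread, ok = benchmark(lambda: next(fake_times))
print(median, round(spread, 2), ok)
```

A report built this way states not just the number but whether the number can be trusted, which is exactly what you want before the metric starts driving bonuses and budgets.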
The upside of all of this preparation is that the right set of metrics can be powerful change agents. They help focus entire organizations on a small set of tasks that can have demonstrable impact on the business.
The point here is that there is a lot of work that has to happen on the customer side before something like Application Experience becomes real. While it is incumbent on the vendors to create solutions that do something better for applications, customers will eventually need to be active participants in any shift in purchasing criteria. Those customers that start early will be in the best position to lead the dialogue with vendors.
And leading will matter because the efficacy of all of this will eventually rest on the existence of a solid analytics foundation. It is possible that clever customers can steer their vendors to work with specific analytics companies. That would give them a tangible deployment advantage, both in terms of acquisition costs (the solution is already on-prem) and operational effort.
For vendors, this means you ought to be looking around now to see who you will partner with. Choose wisely, because if the industry does go through consolidation and your dance partner is gobbled up, you might be left alone. The stakes might not be super high now, but when purchasing decisions hinge on a measurable Application Experience, you might think differently.
[Today's fun fact: In the average lifetime, a person will walk the equivalent of 5 times around the equator. You would think we would all be thinner.]
The post Long-term CIO implications of application experience appeared first on Plexxi.