
Towards Next Generation Enterprise IT

How Will Enterprise IT Look Ten Years From Now?

Data processing power is likely to continue growing. Are contemporary IT development methods, processes and procurement practices properly positioned to take advantage of increasing capabilities?

CPU, memory, and storage can today be provisioned in a few clicks. Limitations imposed by processing power and physical infrastructure will matter less and less, continuing the trend of the past decades.
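To make "a few clicks" concrete, here is a minimal sketch of the same provisioning done programmatically with the AWS SDK for Python (boto3). The region, AMI ID, and instance type are illustrative placeholders, not values from this article, and valid AWS credentials are assumed.

# Minimal sketch: provisioning a server programmatically with boto3.
# Region, AMI ID, and instance type are placeholder assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="m5.large",          # placeholder instance type
    MinCount=1,
    MaxCount=1,
)
print("Provisioned:", response["Instances"][0]["InstanceId"])

A console form or an infrastructure-as-code template achieves the same result; the point is that compute is now an API call rather than a procurement cycle.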

We are gradually approaching a situation where the constraints that shaped present corporate IT standards are no longer an issue. For example, it is current practice to keep OLTP and analytic databases on separate servers, and one of the major drivers for that separation is performance. New generations of in-memory databases and powerful servers all but obviate the need for the distinction.
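As a toy illustration of that convergence, the sketch below runs OLTP-style inserts and an analytic aggregate against a single in-memory engine. SQLite's in-memory mode stands in here purely for demonstration; the article has full-scale in-memory databases (e.g., SAP HANA) in mind.

import sqlite3

# One in-memory engine serving both workloads.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)

# OLTP-style workload: row-at-a-time transactional inserts.
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("east", 120.00), ("west", 75.50), ("east", 300.25)],
)
conn.commit()

# Analytic-style workload: an aggregate over the same live table,
# with no separate warehouse and no ETL step in between.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"
):
    print(region, total)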

We are already at the stage where all we have in front of us is a blank sheet of paper (i.e., a pool of computing resources) that can be filled with previously unheard-of quantities of data and processed with little regard to performance limitations. Do we already have, or do we need, new modeling, programming, and best-practice paradigms to take advantage of this new potential, similar to how relational theory, relational databases, and ER modeling superseded the file and program design techniques of the '60s and '70s?

Today's IT systems are very fragmented: we have data islands, application islands, difficulties in providing unified data views, and master data management problems. Applications are purchased as ERP systems or built in-house, on various platforms. There are attempts to standardize individual components of IT, such as data and infrastructure standards, but, by and large, the corporate IT map (data, processing, infrastructure) looks more like spaghetti than a well-designed, unified system. Some vendors have recognized the need to offer integrated, engineered systems; Oracle, for example, is attempting to offer the full stack, from hardware all the way up to analytic packages.

Zachman, TOGAF, and other frameworks attempt to offer a holistic view of, and method for, enterprise computing architecture. And if you want to build systems yourself, there are UML, ER modeling, system analysis techniques, Agile and Scrum methodologies, and CASE tools.

Then there is the opportunity to skip all of this and simply subscribe to a SaaS application like Salesforce.

If history is any guide, forces driving toward continuing fragmentation will persist. What we have seen in the last 50 years is diversification from just a few vendors to a myriad of products, approaches, and methods. There is also a counter-trend toward consolidation through mergers and acquisitions. Yet another trend is growing volumes of data, as more and more contemporary devices generate data that can be used for customer behavior analysis and the like.

Reasons of economy, efficiency, centralization, and simplification will perhaps give the consolidation forces the upper hand. Powerful, mega-size, pre-built systems will be deployed by mega-vendors. We have seen this trend in OLTP systems already, where major vendors like SAP and Oracle provide mainstream corporate OLTP systems and extend into BI packages and applications. This, together with SaaS, will diminish the need for in-house development, at least for run-of-the-mill applications. Growing computing power will enable vendors to deliver applications of increasing integration, scope, and complexity, to be used as larger building blocks and the new cornerstones of corporate IT architecture.

More Stories By Ranko Mosic

Ranko Mosic, BScEng, specializes in Big Data and data architecture consulting services (database/data architecture, machine learning). His clients are in the finance, retail, and telecommunications industries. Ranko welcomes inquiries about his availability for consulting engagements and can be reached at 408-757-0053 or [email protected]
