
Towards Next Generation Enterprise IT

How Will Enterprise IT Look Ten Years From Now?

Data processing power is likely to continue growing. Are contemporary IT development methods, processes and procurement practices properly positioned to take advantage of increasing capabilities?

CPU, memory, and storage can today be provisioned in a few clicks. Limitations imposed by processing power and physical infrastructure, already far less of a constraint than in the past, will continue to fade in importance.

We are gradually approaching a situation where the constraints that shaped present corporate IT standards are no longer an issue. For example, it is current practice to keep OLTP and analytic databases on separate servers, and one of the major drivers for that separation is performance. New generations of in-memory databases and powerful servers all but obviate the need for such a distinction.
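
To make this convergence concrete, here is a minimal sketch in Python. It uses the standard library's in-memory SQLite engine purely as a stand-in for a commercial in-memory database, and the table and column names are invented for illustration; the point is that transactional writes and an analytic aggregate run against the same store, with no ETL hop to a separate warehouse server.

    import sqlite3

    # One in-memory database serving both OLTP writes and analytic reads.
    # (SQLite here is only a stand-in for a commercial in-memory engine.)
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
    )

    # OLTP side: small transactional writes.
    with conn:
        conn.execute("INSERT INTO orders (region, amount) VALUES (?, ?)", ("EMEA", 120.50))
        conn.execute("INSERT INTO orders (region, amount) VALUES (?, ?)", ("APAC", 75.00))
        conn.execute("INSERT INTO orders (region, amount) VALUES (?, ?)", ("EMEA", 310.25))

    # Analytic side: an aggregate query against the very same store.
    for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
    ):
        print(region, total)

Whether one engine can sustain both workloads at enterprise scale is exactly the bet the in-memory database vendors are making.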

We are already at the stage where all we have in front of us is a blank sheet of paper (i.e., a pool of computing resources) that can be filled with previously unheard-of quantities of data and processed with little regard to performance limitations. Do we already have, or do we need, new modeling, programming, and best-practice paradigms to take advantage of this new potential, similar to how relational theory, databases, and ER modeling superseded the file and program design techniques of the 1960s and 1970s?

Today's IT systems are very fragmented: we have data islands, application islands, difficulties in providing unified data views, and master data management problems. Applications are purchased as ERP systems or built in-house, on various platforms. There are some attempts to standardize components of IT, such as data and infrastructure standards, but by and large the corporate IT map (data, processing, infrastructure) looks more like spaghetti than a well-designed, unified system. Some vendors have recognized the need to offer integrated, engineered systems. Oracle, for example, is attempting to offer the full stack, from hardware all the way up to analytic packages.
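
The data-island problem is easy to illustrate. In the hedged Python sketch below, two hypothetical systems (an ERP and a CRM, with invented record layouts) each hold a fragment of the same customer, and a unified view has to be stitched together in application code, including an arbitrary rule for which source wins on conflicts. This stitching, multiplied across hundreds of systems, is the spaghetti.

    # Two hypothetical data islands holding fragments of the same customer.
    erp_customers = {"C-1001": {"name": "Acme Corp", "credit_limit": 50000}}
    crm_customers = {"C-1001": {"name": "ACME Corporation", "contact": "jdoe@acme.example"}}

    def unified_view(customer_id):
        """Stitch one customer view from both islands.

        Real master data management must also decide which source wins
        on conflicts; here, naively, the ERP value wins."""
        erp = erp_customers.get(customer_id, {})
        crm = crm_customers.get(customer_id, {})
        return {**crm, **erp}  # ERP keys override CRM keys on conflict

    print(unified_view("C-1001"))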

Zachman, TOGAF, and other frameworks attempt to offer a holistic view of, and method for, enterprise computing architecture. We have UML, ER modeling, systems analysis techniques, Agile and Scrum methodologies, and CASE tools, in case you want to build systems yourself.

Then there is the opportunity to skip all of this and simply subscribe to a SaaS application like Salesforce.
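
Consuming such a SaaS application is typically just a call to its hosted API rather than a deployment project. As a hedged sketch, the Python snippet below queries Salesforce's REST query endpoint using the third-party requests library; the instance URL and access token are placeholders, and the API version is an assumption to be checked against your org.

    import requests

    # Placeholders: supplied by your Salesforce org and its OAuth flow.
    INSTANCE_URL = "https://example.my.salesforce.com"
    ACCESS_TOKEN = "<oauth-access-token>"

    # Salesforce exposes data via a REST query endpoint that accepts SOQL;
    # the version segment below is an assumption -- verify it for your org.
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v52.0/query",
        params={"q": "SELECT Name FROM Account LIMIT 5"},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()
    for record in resp.json()["records"]:
        print(record["Name"])

No integration work beyond authentication and a query language: that is the trade SaaS offers.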

If history is any guide, there will be forces driving continued fragmentation. What we have seen over the last 50 years is diversification from just a few vendors to a myriad of products, approaches, and methods. There is also a counter-trend toward consolidation, mergers, and acquisitions. Another trend is growing volumes of data, as more and more contemporary devices generate data that can be used for customer behavior analysis and the like.

Reasons of economy, efficiency, centralization, and simplification will perhaps give the consolidation forces the upper hand. Powerful, mega-size pre-built systems will be deployed by mega-vendors. We have seen this trend in OLTP systems already, where major vendors like SAP and Oracle provide mainstream corporate OLTP systems and extend into BI packages and applications. This, together with SaaS, will diminish the need for in-house development, at least for run-of-the-mill applications. Growing computing power will enable vendors to deliver applications of increasing integration, scope, and complexity, to be used as larger building blocks and the new cornerstones of corporate IT architecture.

About the Author

Ranko Mosic, BScEng, specializes in Big Data and data architecture consulting services (database/data architecture, machine learning). His clients are in the finance, retail, and telecommunications industries. Ranko welcomes inquiries about his availability for consulting engagements and can be reached at 408-757-0053 or [email protected]