

What Exactly is Complex Event Processing Today?

As much as I disagree with much of what Curt Monash writes, he did actually ask a good question recently in his post, “Renaming CEP… or not.”

Without getting into a rehash of the hash over there, let’s look at things a bit differently.  Let’s talk about what CEP is not.

I left trading to join a firm called NEON.  I was an early investor in this company, my mentor had started the firm, and to me it looked like the cat’s meow.  It was a great time, a lot of people made a ton of money, and I was introduced to the world of software via Enterprise Application Integration (EAI).  Using EAI, one could centralize the business logic associated with plugging different systems into each other, convert the format of one system into another, and pass messages around seamlessly between these applications.  There weren’t a lot of competitors at that time, we were arguably #1 in the space, and the technology became MQ Series Integrator (think WebSphere; like I said, we made a lot of money).

Well, CEP isn’t EAI because there’s no concept of format libraries – sure, CEP engines use input/output adapters, but so does every program ever written (I’m waiting for the first salesperson to license the keyboard/screen adapter set – available in different languages soon!).  We’re going to come back to EAI in a moment.

Throughout my career, the teams I’ve worked with have used a variety of 4th generation languages – stuff like PowerBuilder, SQL Windows, Paradox, etc.  Each of those environments had some common elements: screen designers, a domain-specific language designed to make bizapp dev faster, and abstractions for common data stores.  Often, our groups wrote servers that integrated with these front ends via RPC, a streaming connection, or databases.

CEP isn’t a 4th generation bizapp dev environment – there’s no facility for building GUIs.  Although some of the CEP platforms out there do have DSLs, some also use SQL derivations.  I’ve used the SQL derivations (I’ve worked at two of those companies), and guess what?  The people in those firms hated using the language themselves.  “Yes, you could do a covariance matrix with <insert proprietary get-me-sued-for-naming-it-here>, but I could do it faster and easier in a different language.”
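
To make that concrete, here’s a rough sketch in plain Python – my own illustration, not any vendor’s dialect – of the kind of rolling, time-windowed covariance those proprietary SQL derivations made awkward.  The event fields, window length, and sample ticks are hypothetical.

    # Sample covariance of two price fields over a sliding time window,
    # updated continuously as events arrive (nothing is stored on disk).
    from collections import deque

    class SlidingCovariance:
        def __init__(self, window_seconds):
            self.window_seconds = window_seconds
            self.events = deque()  # (timestamp, x, y) tuples still inside the window

        def update(self, timestamp, x, y):
            # Called once per arriving event.
            self.events.append((timestamp, x, y))
            # Evict events that have aged out of the time window.
            while self.events and self.events[0][0] < timestamp - self.window_seconds:
                self.events.popleft()
            return self.covariance()

        def covariance(self):
            n = len(self.events)
            if n < 2:
                return 0.0
            xs = [e[1] for e in self.events]
            ys = [e[2] for e in self.events]
            mean_x, mean_y = sum(xs) / n, sum(ys) / n
            return sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (n - 1)

    # Usage: feed two instruments' ticks as they arrive.
    cov = SlidingCovariance(window_seconds=60)
    for t, x, y in [(0, 100.0, 50.0), (10, 101.0, 50.5), (20, 99.5, 49.8)]:
        print(t, cov.update(t, x, y))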

I’ve also used many databases.  But you don’t use CEP to store data – you only process the data in flight.

So, CEP isn’t EAI, it’s not a database, and it’s not an application development environment.  Where, then, did CEP come from?  Let’s look at a couple of the research projects it grew out of.

The work out of Berkeley and the work out of Brown, Brandeis and MIT focused on event stream processing.  Here’s a blurb about Berkeley’s Telegraph:

Telegraph is an adaptive data-flow system, which allows individuals and institutions to access, combine, analyze, and otherwise benefit from this data wherever it resides.  As a data-flow system, Telegraph can tap into pooled data stored on the network, and harness streams of live data coming out of networked sensors, software, and smart devices.  In order to operate robustly in this volatile, inter-networked world, Telegraph is adaptive – it uses new data-flow technologies to route unpredictable and bursty data-flows through computing resources on a network, resulting in manageable streams of useful information.

And here’s one about Aurora (Brown, Brandeis, & MIT):

Aurora addresses three broad application types in a single, unique framework:

  1. Real-time monitoring applications continuously monitor the present state of the world and are, thus, interested in the most current data as it arrives from the environment. In these applications, there is little or no need (or time) to store such data.
  2. Archival applications are typically interested in the past. They are primarily concerned with processing large amounts of finite data stored in a time-series repository.
  3. Spanning applications involve both the present and past states of the world, requiring combining and comparing incoming live data and stored historical data. These applications are the most demanding as there is a need to balance real-time requirements with efficient processing of large amounts of disk-resident data.

Hmm.  I’ve worked with both of those packages – no mention of Complex Event Processing in there at all.  So where did that phrase even come from?  Well, it comes from David Luckham’s book, “The Power of Events,” in which the good professor describes not so much an implementation as a set of processes designed to help us all run our businesses and missions more effectively.  In the book, though, David references a language that deals with streaming data.  Uh oh…

Around 2005-2006, a couple of firms were struggling to describe what Event Stream Processing was, why it was important and, more importantly, why you should be spending money on it.  I was the CTO of one of those firms.  We competed mostly against StreamBase at the time.  Somewhere during that time frame, the phrase Complex Event Processing was adopted in an effort to differentiate.  At that time, Aleri wasn’t CEP – they were OLAP.  StreamBase, formerly Grassy Brook, probably chose that name in homage to Stream Processing.  Kaskad is Swedish for waterfall, or where a bunch of rivers and/or streams collide.  I don’t think Apama ever used the phrase ESP – they were focused on trading from the start.  Starting to get the picture?

So, if CEP isn’t EAI, and it’s not a 4th generation bizapp tool, what is it?  I’ve probably kicked this dead horse enough, but one more time and it’s not going to notice.  CEP needs four things to be called CEP (or ESP…) – see the sketch after this list:

  1. Domain Specific Language
  2. Continuous Query
  3. Time or Length Windows
  4. Temporal Pattern Matching
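
Here’s a minimal sketch, in plain Python of my own rather than any product’s engine, of the last three items working together: a continuous query that runs on every arriving event, a time window that ages events out, and a temporal pattern (“A followed by B within N seconds”).  The event names, window length, and replayed stream are hypothetical; a real CEP product would express this in its DSL (item 1) instead of hand-rolled code.

    # A continuous query over a time window with a simple temporal pattern match.
    from collections import deque

    WINDOW_SECONDS = 10                          # time window (item 3)
    PATTERN = ("PRICE_DROP", "VOLUME_SPIKE")     # temporal pattern: A then B (item 4)

    window = deque()                             # events still inside the window

    def on_event(timestamp, name):
        # Continuous query (item 2): evaluated per event, never against stored data.
        window.append((timestamp, name))
        # Expire events older than the window.
        while window and window[0][0] < timestamp - WINDOW_SECONDS:
            window.popleft()
        # Did PATTERN[0] occur earlier in the window than this PATTERN[1] event?
        earlier_names = [n for _, n in list(window)[:-1]]
        if name == PATTERN[1] and PATTERN[0] in earlier_names:
            print(f"pattern matched at t={timestamp}: {PATTERN[0]} then {PATTERN[1]}")

    # Usage: replay a small stream; only the spike at t=8 follows a drop in time.
    for t, n in [(1, "PRICE_DROP"), (4, "TRADE"), (8, "VOLUME_SPIKE"), (25, "VOLUME_SPIKE")]:
        on_event(t, n)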

These four things, in my opinion, don’t make up a separate space, let alone a market.  What they describe is Event Stream Processing.  What they describe are features found in larger, more complete event processing environments from IBM, SAP, TIBCO, and Progress.  TIBCO, for example, just added the missing features described above to its Business Events platform and had instant CEP (sarcasm mine).  Those offerings look a lot like the traditional EAI platforms – which is where all of this began roughly 20 years ago.

So, I don’t think what a couple of vendors sell as Complex Event Processing is really CEP at all.  If you want an idea of what CEP is really all about, read David’s book to get started.  Then take a look at Tim Bass’s blog thecepblog.com.

In my next post, I’ll describe what CEP means to me and talk about some of the current offerings in that space.  But for now, let’s just drop the phrase CEP (it’s mostly just Stream Processing), because it means so little to so many and fails to impart any meaningful message to the people who actually write checks for this stuff.

Thanks for reading!


More Stories By Colin Clark

Colin Clark is the CTO for Cloud Event Processing, Inc. and is widely regarded as a thought leader and pioneer in both Complex Event Processing and its application within Capital Markets.

Follow Colin on Twitter at http://twitter.com/EventCloudPro to learn more about cloud based event processing using map/reduce, complex event processing, and event driven pattern matching agents. You can also send topic suggestions or questions to [email protected]
