A Smarter Planet
You've heard it before but it warrants repeating: Every natural and man-made system is becoming instrumented, interconnected and intelligent. We now have the ability to measure, sense and monitor the condition of almost everything. Furthermore, people, systems and objects can communicate and interact with each other in entirely new ways. This means that we can respond to changes quickly and get better results by predicting and optimising for future events.
That's the good news but there are challenges.
Today, more than ever, organisations are under pressure to leverage a wealth of information to make more intelligent choices.
Volume of data: Data volumes are expected to grow tenfold in the next three years. This is driven by the proliferation of end-user devices, real-world sensors, lower-cost technologies and population growth.
Velocity of decision-making: The market demands that businesses optimise decisions, take action based on good information and utilise advanced predictive capabilities - all with speed and efficiency.
Variety of information: With the expansion of information comes large variance in the character of available data - often very noisy and full of errors, with no opportunity to cleanse it in a world of real-time decision-making.
Shift in what we analyse: Enterprises need a broader, systems-based approach to the information they examine and optimise. Stream computing and event processing capabilities are enabling the analysis of massive volumes.
Event Processing
Event processing provides us with an approach to leverage this wealth of information for competitive advantage and for the betterment of people.
An Event: Any signal indicating that a change of state (of the business, real-world or other domain) has occurred, or is contemplated as having occurred.
Simple Event Processing (SEP): Simple event processing is not a new concept. SEP provides functions to detect and respond reactively to a single source, or homogeneous event type.
Complex Event Processing (CEP): CEP detects and responds reactively to patterns among like or related events, missing events and aggregate events. CEP supports high volumes of homogeneous events within a predictable time and order.
Business Event Processing (BEP): BEP extends CEP and provides a graphical, non-programmatic user interface that allows business users to manage event processing logic themselves. It supports high volumes of heterogeneous business events and complex patterns that occur in no particular time or order.
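To make the SEP/CEP distinction concrete, here is a minimal sketch. The event shape, thresholds and rule names are illustrative assumptions, not any product's API: the SEP rule reacts to a single event in isolation, while the CEP rule detects a pattern (an aggregate of related events) inside a time window.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class Event:
    source: str   # who or what emitted the event
    kind: str     # the type of state change
    value: float  # an associated measurement (if any)
    ts: float     # timestamp in seconds


# SEP: respond reactively to a single event from one source.
def sep_rule(event: Event) -> bool:
    """Fire when one temperature reading exceeds a threshold."""
    return event.kind == "temperature" and event.value > 90.0


# CEP: detect a pattern among related events within a time window.
class FailedLoginPattern:
    """Fire when three failed-login events for the same source
    arrive within 60 seconds - an aggregate/pattern rule that no
    single event could trigger on its own."""

    def __init__(self, window_s: float = 60.0, count: int = 3):
        self.window_s = window_s
        self.count = count
        self.by_source: dict[str, deque] = {}

    def on_event(self, event: Event) -> bool:
        if event.kind != "login_failed":
            return False
        q = self.by_source.setdefault(event.source, deque())
        q.append(event.ts)
        # Evict timestamps that have fallen out of the window.
        while q and event.ts - q[0] > self.window_s:
            q.popleft()
        return len(q) >= self.count


cep = FailedLoginPattern()
events = [Event("alice", "login_failed", 0.0, t) for t in (0.0, 10.0, 20.0)]
fired = [cep.on_event(e) for e in events]
# The pattern fires only once the third related event arrives:
# fired == [False, False, True]
```

BEP would then sit above rules like these, letting a business user adjust the window and count through a graphical interface rather than in code.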
Event processing is complementary to SOA
SOA is an approach to modelling your business, and hence IT, as a set of reusable activities or services. Services are then orchestrated or composed into coarser services, processes or applications. SOA communication is typically (or perhaps narrowly understood to be) implemented as a request/response pattern.
Event processing adds an asynchronous pattern to SOA components and services. These patterns are complementary. Services can emit events and consume events. The event processing itself is performed as an extended capability of an ESB.
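The complementarity can be sketched with a toy in-process publish/subscribe bus. This is an illustrative stand-in for the asynchronous leg an ESB would provide, not an ESB implementation; the topic name and service functions are invented for the example. The point is that the emitting service does not know or call its consumers, whereas request/response couples caller to callee.

```python
from collections import defaultdict


class EventBus:
    """Toy in-process publish/subscribe bus - a stand-in for the
    asynchronous event channel an ESB would provide."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver to every subscriber; the publisher does not know
        # who (if anyone) is listening.
        for handler in self.subscribers[topic]:
            handler(payload)


# A "service" that emits an event as a side effect of its work...
def place_order(bus, order_id):
    # ...its normal request/response logic would run here...
    bus.publish("order.placed", {"order_id": order_id})


# ...and another service consumes that event without being called directly.
received = []
bus = EventBus()
bus.subscribe("order.placed", received.append)
place_order(bus, 42)
# received == [{"order_id": 42}]
```

New consumers can be added by subscribing to the topic, with no change to the emitting service - which is what makes the event pattern complementary to, rather than a replacement for, request/response.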
Conclusion
Event processing, and especially business event processing, begins to deliver on the "Utopian ideals" of real-time analytics and business-configured processing.
This is big ... and it's going to get a lot bigger.
2 comments:
As a CTO at IBM, it would be interesting to hear your thoughts on where event processing will be used at IBM other than in their commercials :) As the CEO of StreamBase, the MIT-founded company regarded widely as the leader in event processing (by the World Economic Forum, MIT Tech review, Time Magazine, etc.), we have been happy to invite and compete with IBM in the event processing space. However, IBM's offerings haven't made much commercial headway. We have seen them, for example, in some government projects packaged along with massive services engagements, but those applications tend to be 98% custom with a little CEP.
What do you see as the key applications in your part of the world that will be CEP powered?
- Mark Palmer, CEO,
StreamBase
Thanks. Unfortunately I cannot respond officially to the statements made in your first paragraph, but what I can say is that, from my perspective, technologies come and go - and as such I am primarily interested in the patterns and how we employ them to solve real-world problems.
CEP/BEP is gaining traction with my clients as there is an emerging understanding that it is an approach that enables exception processing at a lower cost. By this I mean that whilst it is efficient to orchestrate the known and static, it is less so for the exceptions.
A client once said to me that it didn't make financial sense to codify every eventuality. It was cheaper for him to employ people to manually deal with the exceptions, as long as he had integrity and auditability in the system.
CEP/BEP provides an answer to this, in addition to the other well-known reasons for adopting the pattern (for example, responding to rapidly changing situations). As with all systems it comes down to granularity/decomposition: some situations are better suited to orchestration, and others are better handled through the detection of an event (or pattern) and the subsequent response to it. "Right tool for the job" comes to mind.
Ultimately I hope that CEP/BEP will provide people within my region with the ability to respond quickly to some of the big problems facing us. Health-care comes to mind - both the management and allocation of scarce resources (medicine, doctors) and the response to medical emergencies.