Blog: Barry Devlin

Barry Devlin

As one of the founders of data warehousing back in the mid-1980s, a question I increasingly ask myself over 25 years later is: Are our prior architectural and design decisions still relevant in the light of today's business needs and technological advances? I'll pose this and related questions in this blog as I see industry announcements and changes in the way businesses make decisions. I'd love to hear your answers and, indeed, questions in the same vein.

About the author

Dr. Barry Devlin is among the foremost authorities in the world on business insight and data warehousing. He was responsible for the definition of IBM's data warehouse architecture in the mid '80s and authored the first paper on the topic in the IBM Systems Journal in 1988. He is a widely respected consultant and lecturer on this and related topics, and author of the comprehensive book Data Warehouse: From Architecture to Implementation.

Barry's interest today extends to the wider field of a fully integrated business, covering informational, operational and collaborative environments and, in particular, how to present the end user with an holistic experience of the business through IT. These aims, and a growing conviction that the original data warehouse architecture struggles to meet modern business needs for near real-time business intelligence (BI) and support for big data, drove Barry's latest book, Business unIntelligence: Insight and Innovation Beyond Analytics, now available in print and eBook editions.

Barry has worked in the IT industry for more than 30 years, mainly as a Distinguished Engineer for IBM in Dublin, Ireland. He is now founder and principal of 9sight Consulting, specializing in the human, organizational and IT implications and design of deep business insight solutions.

Editor's Note: Find more articles and resources in Barry's BeyeNETWORK Expert Channel and blog.

Yesterday's Information Management webinar, "Information Overload? 3 ways real-time information is changing decision management," hosted by Eric Kavanagh, got me into a philosophical mood. Robin Bloor of the Bloor Group and David Olson of Progress Software provided a fascinating overview of the development of the complex event processing (CEP) market and its increasing importance for business competitiveness and, perhaps, even survival.

Robin's message was twofold. From a business viewpoint, decision making is moving from reactive to predictive, driven by competitive pressure to understand upcoming market opportunities right to the edge of what can be foreseen. In technology, the exponential growth in the speed of processors and, indeed, of most of the hardware infrastructure is driving application architectures from batch orientation, through transaction processing, and into real-time event handling. David provided some interesting examples of how CEP is being used by his customers in manufacturing, logistics and airlines to monitor business events in real time, to react earlier to changing circumstances and to drive process improvement. Their bottom line: CEP is a major architectural transition that is rapidly becoming mainstream; if you're not on board, you risk severe competitive disadvantage.

From one point of view, the message makes sense. It's yet another turn of the screw towards speedier decision making. Operational BI promotes the use of near real-time data, either copied into the warehouse or accessed in transaction-processing systems via federated query. CEP goes one step further and says: let's access and analyze the data as it flows through the network; we need to make decisions before we land the data on disk, if we even land it at all. In the financial markets, with the data volumes and reaction speeds involved, the approach seemed like a no-brainer once the technology became available. In financial systems, CEP enables high-value applications such as fraud detection in credit card transactions.
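To make the in-flight idea concrete, here's a minimal sketch in plain Python. Everything in it - the event fields, the ten-minute window and the crude "same card, two cities" rule - is my own illustrative assumption, not any vendor's engine or API:

```python
from collections import defaultdict

WINDOW_SECS = 600            # look-back window per card (illustrative)
recent = defaultdict(list)   # card_id -> [(timestamp, city), ...]

def on_transaction(card_id, ts, city):
    """Called for each event as it arrives; flags a card seen in two
    different cities within the window. Nothing is landed on disk."""
    events = recent[card_id]
    # Expire events that have fallen out of the look-back window.
    events[:] = [(t, c) for t, c in events if ts - t <= WINDOW_SECS]
    suspicious = any(c != city for _, c in events)
    events.append((ts, city))
    return suspicious

# A small stream of (card, time-in-seconds, city) events.
stream = [("c1", 0, "Dublin"), ("c1", 120, "Dublin"), ("c1", 300, "Lagos")]
for card, ts, city in stream:
    if on_transaction(card, ts, city):
        print(f"possible fraud: card {card} at t={ts}s in {city}")
```

A real CEP engine expresses such patterns declaratively and handles far higher volumes, but the shape of the computation - match patterns in the stream, decide, and only then (perhaps) persist - is the same.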

However, looking at some of the applications presented in the webinar, operational BI has also been used effectively to solve similar business problems. The boundary between the more traditional operational BI approach and CEP depends on the required speed of decision making and the volume of events involved. CEP certainly extends the high end of pattern recognition and trend detection to greater speeds and volumes. And in the middle range, it provides another set of implementation options alongside operational BI.

So, what were my philosophical musings? Robin presented a very interesting scale of human decision-making timescales, from months and years at one extreme to one tenth of a second at the other. That latter number is the fastest human reaction time and, by the way, slower than a cobra's strike! CEP and, to some extent, operational BI operate in the range of decision speeds faster than one tenth of a second: that is, entirely beneath human radar. While it is clear that some decisions--collision avoidance on the highway, for example--naturally fall in this timescale, my concern is about the implications of pushing more and more decisions into this realm and, by definition, beyond human oversight. We've already seen the consequences of this approach in the financial markets, where computer-based trading has driven wild, unpredictable and potentially dangerous swings. Decision-making algorithms are only as good as the assumptions that have been encoded in them, which depend, in turn, on the knowledge available and the business requirements--both explicit and implicit--when they were created. Is it really sensible to design systems that unnecessarily exclude human wisdom?

The current business mindset that competitiveness is next to godliness is, in many cases, driving decision making into tighter and tighter circles, removing wisdom, insight, intuition and basic humanity from the loop. Are we prepared to learn any lessons from the recent financial market fluctuations? And is it wise to arbitrarily remove more and more important decision making from human oversight just because the technology is available? Just asking...

Posted August 27, 2010 7:05 AM

4 Comments

Thanks for your post. The overlap between operational BI and CEP is an interesting one. I think you've identified one of the most visible distinctions, that of speed and latency, but I'd like to expand on it. For many of our customers in the capital markets, StreamBase is first chosen because of its ability to meet their performance goals. But the benefits of CEP go beyond that, which is why we are seeing CEP expand well beyond finance.

In addition to timely processing, CEP excels at rapid application development and at data integration. When you're building applications that drive your core business, it's important to be able to change the application at the speed of your business. This means tools that support communication, collaboration and the use of agile development methods.

StreamBase Studio, our integrated development environment, is what keeps customers coming back to StreamBase, even when ultra low latency and high throughput are not required. By enabling developers, quantitative analysts, and business analysts to use the same tools for application development, modification, testing, and deployment, we keep teams working closely together.

In addition to productivity and reduced iteration times, CEP is also strong when it comes to ad hoc data integration. Business intelligence, whether it is operational or not, is generally based on the idea of pulling all the information together into a single gold-master data warehouse. This kind of data management is great when it is possible. But oftentimes firms have disparate systems that aren't nearly this well integrated.

Using CEP and Event-Driven Architecture (EDA), firms can bring in data from a variety of sources, simply by instrumenting those sources to publish events, or by listening to events that already exist on enterprise messaging systems or in databases. Rather than having to first develop a data model, warehouse and ETL process, CEP users can rapidly access the data they want, using it to drive decisions.
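As a rough illustration of that publish-and-listen pattern, here is a toy in-process event bus in Python. The bus, topic names and shipment rule are invented for illustration - they stand in for a real enterprise messaging system, not StreamBase's API:

```python
from collections import defaultdict

class EventBus:
    """Toy stand-in for an enterprise messaging system."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.handlers[topic]:
            handler(event)

bus = EventBus()

# An existing order system, instrumented to publish events
# rather than being wrapped in a data model and ETL process.
def ship_order(order_id, delay_hours):
    bus.publish("order.shipped", {"id": order_id, "delay": delay_hours})

# A decision-driving consumer, added without touching the source system.
def escalate_late_shipments(event):
    if event["delay"] > 24:
        print(f"escalate order {event['id']}: {event['delay']}h late")

bus.subscribe("order.shipped", escalate_late_shipments)

ship_order("A-17", 36)   # -> escalate order A-17: 36h late
ship_order("A-18", 2)    # no action taken
```

New consumers can be added or changed without renegotiating a shared schema, which is where the integration speed comes from.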

There is a lot of opportunity for CEP and operational BI to work together, and CEP tools have a lot to offer beyond just performance. Rapid development and ease of data integration are just as important as reaction time. As StreamBase's customers are fond of noting, the latency that matters most is the time from idea to implementation.

Richard Tibbetts
Chief Technology Officer
StreamBase Systems

Hi Richard,
Thanks for your comments and the extensive description of StreamBase's strengths. It's not clear to me why, a priori, CEP should offer more rapid development and easier data integration than "traditional" development approaches. Perhaps that's a discussion we could take offline? I would also be interested to hear your view here on the business mindset that moves more and more important decision making out of human control - as consultants and tool providers, I feel we need to have an opinion about how technology is being used (or abused).

Barry.

Barry,

Praise for your post and the musings you put forward. And a very difficult question indeed. I think there are multiple dimensions from which to approach this issue. One would be the risk dimension: risk equates to chance times consequence. To stay with the example from the financial services industry: in retrospect, some kind of human oversight was definitely necessary.

I'd like to put forward another dimension: whether real-time decision making is necessary in the first place. Obviously, the answer to this question depends on the market, the business context, the type of decision and so on. But all too often this kind of nuance is overlooked, and the simple availability of the technology, combined with a single customer success story, is enough to create a bandwagon effect.

@Richard: your response is way out of proportion here. Barry invites you to engage in a conceptual discussion; you just use it to promote your product. Trade shows and vendor presentations are the place for that.

Wouter van Aerle

Hi Wouter,
Your comments are right to the point.

Risk is a very good starting point for discussion. Staying with the financial industry, let's take a simpler example - credit card fraud. (Near) real-time identification of fraudulent behaviour from event patterns clearly lowers the risk of losing more money - a good thing, I'm pretty sure! There is, of course, another side to the risk - that of misidentifying a pattern as fraud and denying the real card owner credit (and stranding her in Timbuktu with no money!). Clearly, there is a balance of risks here, and probably most people would argue that the balance favours (near) real-time, automated decision making.
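In fact, Wouter's chance-times-consequence framing makes that balance easy to put rough numbers on. Here's a back-of-envelope sketch; every figure in it is invented purely to show the shape of the trade-off:

```python
# Risk = chance x consequence, per transaction, with invented numbers.
p_fraud, fraud_loss       = 0.001, 900.0  # chance / cost of a missed fraud
p_false_alarm, alarm_cost = 0.010, 5.0    # chance / cost of blocking a good card

cost_of_blocking = p_false_alarm * alarm_cost  # annoyed genuine customers
cost_of_allowing = p_fraud * fraud_loss        # fraud losses absorbed

print(f"expected cost if we block suspect transactions: {cost_of_blocking:.2f}")  # 0.05
print(f"expected cost if we let them through:           {cost_of_allowing:.2f}")  # 0.90
```

With these (made-up) numbers, the automated block wins by a wide margin - which is exactly the balance most people's intuition suggests in the fraud case.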

I'm much more unsure about the balance in automated trading, for example, and would love to hear views (both pro and con) on that. And maybe examples from other industries...

Thanks for your feedback!
Barry.
