

Blog: Barry Devlin

Barry Devlin

As one of the founders of data warehousing back in the mid-1980s, a question I increasingly ask myself over 25 years later is: Are our prior architectural and design decisions still relevant in the light of today's business needs and technological advances? I'll pose this and related questions in this blog as I see industry announcements and changes in the way businesses make decisions. I'd love to hear your answers and, indeed, questions in the same vein.

About the author >

Dr. Barry Devlin is among the foremost authorities in the world on business insight and data warehousing. He was responsible for the definition of IBM's data warehouse architecture in the mid '80s and authored the first paper on the topic in the IBM Systems Journal in 1988. He is a widely respected consultant and lecturer on this and related topics, and author of the comprehensive book Data Warehouse: From Architecture to Implementation.

Barry's interest today covers the wider field of a fully integrated business, spanning informational, operational and collaborative environments and, in particular, how to present the end user with a holistic experience of the business through IT. These aims, and a growing conviction that the original data warehouse architecture struggles to meet modern business needs for near real-time business intelligence (BI) and support for big data, drove Barry's latest book, Business unIntelligence: Insight and Innovation Beyond Analytics, now available in print and eBook editions.

Barry has worked in the IT industry for more than 30 years, mainly as a Distinguished Engineer for IBM in Dublin, Ireland. He is now founder and principal of 9sight Consulting, specializing in the human, organizational and IT implications and design of deep business insight solutions.

Editor's Note: Find more articles and resources in Barry's BeyeNETWORK Expert Channel and blog. Be sure to visit today!

Well, perhaps not close to your heart, but certainly close to the heartbeat of your business. This is a key message of an IBM Virtual Event debuting at 10:30 a.m. EST in the U.S. and 10:30 a.m. GMT / 11:30 a.m. CET in Europe on November 28, where I'll talk about modern mission-critical Business Analytics.

For many businesses, embedding operational analytics in the heart of their OLTP (online transaction processing) applications is a key initiative for 2013. The leaders, of course, have already begun. The old operational data store (ODS) and operational BI were precursors as far back as the mid-90s, attempting to make faster decisions about operational matters. These initiatives have had their success stories, but they have been limited by a number of factors, both analytical and operational. The analytical issue has often been the lack of sufficient quantities of transaction and event data to mine effectively. The operational issue has been the difficulty of getting close enough to the near real-time responses required by business users and customers.

Both of these issues are being addressed with today's technologies. The enormous growth of business on the Web in the past decade has meant that customer behavior can be analyzed through clickstreams within websites and linkages across different websites, call centers and more. Such information, analyzed in combination with transaction data, allows retailers to cross-sell more effectively, hotels to increase room occupancy and telcos to reduce churn. But for this blog, and for the above event, the more interesting point is how to close the real-time gap.

Traditionally, business intelligence operates on data that has been extracted from the operational environment, with the analytic outcomes applied back to that environment afterwards. In short, the data is brought to the analytics. This approach introduces significant delays. An obvious solution would be to bring the analytics to the data; however, prior technology did not easily allow that. I discuss this here in terms of the mainframe (System z) environment, but the principle applies elsewhere too.
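
To make the contrast concrete, here is a minimal sketch of the two patterns in Python. It is purely illustrative: an in-memory SQLite table stands in for the operational store, and the table, columns and "score" (average spend per customer) are invented for the example, not drawn from any real System z schema.

    # Illustrative only: in-memory SQLite stands in for the operational store;
    # names and the "score" are invented for this sketch.
    import sqlite3
    import statistics

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE transactions (customer_id TEXT, txn_amount REAL)")
    conn.executemany("INSERT INTO transactions VALUES (?, ?)",
                     [("c1", 120.0), ("c1", 80.0), ("c2", 45.0)])

    def score_by_extraction(conn):
        # "Bring the data to the analytics": pull the detail rows out of the
        # operational store, then compute in the application tier. Latency and
        # cost grow with the volume of data moved before analysis even starts.
        rows = conn.execute(
            "SELECT customer_id, txn_amount FROM transactions").fetchall()
        per_customer = {}
        for customer_id, amount in rows:
            per_customer.setdefault(customer_id, []).append(amount)
        return {c: statistics.mean(v) for c, v in per_customer.items()}

    def score_in_place(conn):
        # "Bring the analytics to the data": push the computation down to the
        # engine and move only the small result set back.
        return dict(conn.execute(
            "SELECT customer_id, AVG(txn_amount) FROM transactions "
            "GROUP BY customer_id").fetchall())

    print(score_by_extraction(conn))   # {'c1': 100.0, 'c2': 45.0}
    print(score_in_place(conn))        # {'c1': 100.0, 'c2': 45.0}

The point is not the trivial aggregation but where it runs: in the second pattern only the result set crosses the wire, which is what keeps latency compatible with operational decision-making.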

It is an oft-forgotten fact that 70% of all data transactions in the banking, insurance, retail, telecommunications, utilities and government industries still occur on the System z platform, due to its performance, cost, reliability and security characteristics. The inclusion of the Netezza-powered IBM DB2 Analytics Accelerator within the System z complex creates a system with a dual personality: the transactional performance of the original environment combined with the analytic performance of Netezza required for integrated operational analytics. With the addition of SPSS Predictive Analytics on Linux and Cognos on the z/OS and Linux platforms, the need to move data out of the System z environment is largely eliminated. IBM's Dan Wardman and David Jeffries will fill in the technical details at the Virtual Event. See also my white paper, "Integrating Analytics into the Operational Fabric of Your Business: A combined platform for optimizing analytics and operations".
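
For a flavour of what this looks like from an application's point of view, the sketch below uses the ibm_db Python driver to run an analytic query against DB2 for z/OS, requesting acceleration via the CURRENT QUERY ACCELERATION special register. Treat it as a hypothetical illustration: the connection string, schema, column names and threshold are invented, and it assumes an accelerator-enabled subsystem.

    # Hypothetical illustration: connection details, table and threshold are
    # invented; assumes the ibm_db driver and an accelerator-enabled DB2 for
    # z/OS subsystem.
    import ibm_db

    conn = ibm_db.connect(
        "DATABASE=DSNV11P;HOSTNAME=zos.example.com;PORT=446;"
        "PROTOCOL=TCPIP;UID=analyst;PWD=secret;", "", "")

    # Ask DB2 to route eligible queries to the accelerator, falling back to
    # the native engine when acceleration is not possible.
    ibm_db.exec_immediate(
        conn, "SET CURRENT QUERY ACCELERATION = ENABLE WITH FAILBACK")

    # An analytic scan over transaction detail that would previously have
    # been extracted and shipped off-platform before it could be analyzed.
    stmt = ibm_db.exec_immediate(conn, """
        SELECT account_id, SUM(txn_amount) AS total_spend, COUNT(*) AS txn_count
        FROM card_transactions
        WHERE txn_date >= CURRENT DATE - 90 DAYS
        GROUP BY account_id
        HAVING SUM(txn_amount) > 10000
    """)

    row = ibm_db.fetch_assoc(stmt)
    while row:
        print(row["ACCOUNT_ID"], row["TOTAL_SPEND"], row["TXN_COUNT"])
        row = ibm_db.fetch_assoc(stmt)

    ibm_db.close(conn)

The application changes very little; the decision about where the heavy analytic work runs is made by the database engine, which is exactly what keeps the analytics close to the transactional heartbeat.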

Irrespective of platform, it is becoming increasingly clear that when it comes to operational decisions, they have to come from the heart rather than the head!



Posted November 27, 2012 12:54 AM