Blog: Barry Devlin


As one of the founders of data warehousing back in the mid-1980s, a question I increasingly ask myself over 25 years later is: Are our prior architectural and design decisions still relevant in the light of today's business needs and technological advances? I'll pose this and related questions in this blog as I see industry announcements and changes in the way businesses make decisions. I'd love to hear your answers and, indeed, questions in the same vein.

About the author

Dr. Barry Devlin is among the foremost authorities in the world on business insight and data warehousing. He was responsible for the definition of IBM's data warehouse architecture in the mid '80s and authored the first paper on the topic in the IBM Systems Journal in 1988. He is a widely respected consultant and lecturer on this and related topics, and author of the comprehensive book Data Warehouse: From Architecture to Implementation.

Barry's interest today covers the wider field of a fully integrated business, covering informational, operational and collaborative environments and, in particular, how to present the end user with an holistic experience of the business through IT. These aims, and a growing conviction that the original data warehouse architecture struggles to meet modern business needs for near real-time business intelligence (BI) and support for big data, drove Barry’s latest book, Business unIntelligence: Insight and Innovation Beyond Analytics, now available in print and eBook editions.

Barry has worked in the IT industry for more than 30 years, mainly as a Distinguished Engineer for IBM in Dublin, Ireland. He is now founder and principal of 9sight Consulting, specializing in the human, organizational and IT implications and design of deep business insight solutions.

Editor's Note: Find more articles and resources in Barry's BeyeNETWORK Expert Channel and blog. Be sure to visit today!

"Oracle Exalytics enables organizations to make decisions faster... through the introduction of interactive visualization capabilities that make every user an analyst" from Oracle's Exalytics press release of 3 October.

"The ultimate challenge... is putting enough useful Big Data capabilities into the hands of the largest number of workers. The organizations that figure out this part will reap corresponding rewards." Dion Hinchcliffe's recent post "The enterprise opportunity of Big Data: Closing the 'clue gap'".

Sorry to sound like Ebenezer Scrooge of Dickens' "A Christmas Carol", but... Bah, humbug!

Some business users have been doing analytics for years... in Excel.  Do we consider that an Information Management success story?  Have the business benefits far outweighed the costs in terms of users' time, IT's efforts in trying to provide data or, indeed, the numerous spreadsheet-induced business mistakes and mis-stated statutory reports?  In a word, no!

So, what do you think?  Will providing growing volumes of increasingly diverse data through ever more sophisticated and speedy statistical analysis tools make this situation better or worse?  Furthermore, does every user want or need to be a statistician?

I believe that we are in danger of being caught on a hype wave here.  Extreme analytics and big data certainly have an important role to play in modern business.  But that role is in exploring new opportunities for, and threats to, the business.  For many managers, regular reports and the ability to drill down into exceptions and outliers are as much as they need.  In other words, traditional BI.  For much of the business, the focus is on the minutiae and the mundane.  For daily decisions--and such decisions are the heartbeat of the business--the information required and the implications of the vast majority of possible circumstances are already largely known.  Big data and extreme analytics are unnecessary.  What is required is faster access to current transaction data or easier access to background content.

We've known about this fork in BI for many years.  It's the difference between tactical/strategic and operational BI.  And while analytics and big data are getting the publicity, much is going on to restructure and re-architect the foundations of traditional BI.  One of these advances is data virtualization.

The emergence of big data has, of course, made data virtualization a mandatory technology for BI.  Given the volumes of data involved, it makes less and less sense to duplicate data on the scale we do it today.  And reduced duplication means that remote access, federation, EII or whatever term you like becomes a key component of any modern BI architecture. I'll be discussing this at the kickoff webinar today of Denodo's Data Virtualization World Series, also available on-demand from B-eye-Network.
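To make the idea concrete, here is a minimal sketch of what federation means in practice: a query is answered by visiting each source live at request time, rather than by first copying everything into a central store. The two in-memory SQLite databases below are hypothetical stand-ins for an operational system and a warehouse; real data virtualization products add query optimization, caching and a unified metadata layer on top of this basic pattern.

```python
import sqlite3

# Two stand-in "remote" sources: an operational store and a warehouse.
# (Illustrative schemas only -- not taken from any particular product.)
ops = sqlite3.connect(":memory:")
ops.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
ops.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "EMEA", 100.0), (2, "APAC", 250.0)])

dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
dw.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(101, "EMEA", 900.0)])

def federated_orders(region):
    """Answer a query by pushing it to each source at request time.

    No central copy of the data is ever materialized: the caller sees
    one logical 'orders' view, backed by two physical stores.
    """
    rows = []
    for source in (ops, dw):
        rows += source.execute(
            "SELECT id, region, amount FROM orders WHERE region = ?",
            (region,)).fetchall()
    return rows

# One logical query, two physical sources, zero duplication.
print(federated_orders("EMEA"))
```

The point of the sketch is the trade-off: the federated view always reflects current transaction data (the operational-BI need discussed above), at the cost of depending on the sources being reachable and fast enough at query time.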

So while we're dreaming dreams of extreme analytics and big data in Christmas' Future, let us also keep our eyes firmly fixed on Christmas Present and how we meet the current needs of the majority of ordinary business users.


Posted October 5, 2011 6:54 AM

1 Comment

I could not agree more. As organizations delve into BI, it seems the data tables start growing, and then common users do not understand what they are supposed to count.

Then they say... "I just want to know how many widgets we have, not a complete inventory timeline with all fields." The other danger is that when everyone is a report builder, it is very likely there will be more than one version of the truth.

Oh well .. bah humbug for me also. :-)

Tricia
http://www.bi-notes.com
