Blog: Barry Devlin

Barry Devlin

As one of the founders of data warehousing back in the mid-1980s, a question I increasingly ask myself over 25 years later is: Are our prior architectural and design decisions still relevant in the light of today's business needs and technological advances? I'll pose this and related questions in this blog as I see industry announcements and changes in the way businesses make decisions. I'd love to hear your answers and, indeed, questions in the same vein.

About the author

Dr. Barry Devlin is among the foremost authorities in the world on business insight and data warehousing. He was responsible for the definition of IBM's data warehouse architecture in the mid '80s and authored the first paper on the topic in the IBM Systems Journal in 1988. He is a widely respected consultant and lecturer on this and related topics, and author of the comprehensive book Data Warehouse: From Architecture to Implementation.

Barry's interest today covers the wider field of a fully integrated business, covering informational, operational and collaborative environments and, in particular, how to present the end user with an holistic experience of the business through IT. These aims, and a growing conviction that the original data warehouse architecture struggles to meet modern business needs for near real-time business intelligence (BI) and support for big data, drove Barry’s latest book, Business unIntelligence: Insight and Innovation Beyond Analytics, now available in print and eBook editions.

Barry has worked in the IT industry for more than 30 years, mainly as a Distinguished Engineer for IBM in Dublin, Ireland. He is now founder and principal of 9sight Consulting, specializing in the human, organizational and IT implications and design of deep business insight solutions.


Recently in Social Networking Category

It's that time of year when every analyst worth his or her salt is making predictions for the coming year.  Acquisitions.  Big data.  Mobile BI.  Cloud.  Social media.  Predictive analytics... hey! Wait a minute!

My question is: how many of these predictions about BI 2012 are based on the use of predictive analytics?  My hunch is... none.  Perhaps I'm being unfair?  Is it predictive analytics to use all those surveys of buying intentions as input?  What about using trend numbers for market share over the past few years? 

So, here is the Wikipedia definition: "Predictive analytics is an area of statistical analysis that deals with extracting information from data and using it to predict future trends and behavior patterns. The core of predictive analytics relies on capturing relationships between explanatory variables and the predicted variables from past occurrences, and exploiting it to predict future outcomes."  What do you think?  How is the fit?
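For a concrete (and entirely hypothetical) illustration of that definition, here is a minimal Python sketch. The numbers are invented and the "model" is nothing more than a straight-line fit, but it shows the shape of the thing: capture a relationship between an explanatory variable and a predicted variable from past occurrences, then exploit it to predict a future outcome.

```python
import numpy as np

# Invented, illustrative numbers only: share of surveyed firms saying they
# will invest in mobile BI, by year (explanatory variable: year).
years = np.array([2008, 2009, 2010, 2011])
share = np.array([0.12, 0.17, 0.24, 0.33])

# Capture the relationship from past occurrences with a simple linear fit...
slope, intercept = np.polyfit(years, share, deg=1)

# ...and exploit it to predict the outcome for the coming year.
predicted_2012 = slope * 2012 + intercept
print(f"Predicted 2012 share: {predicted_2012:.2f}")
```

By that standard, very few of the year-end prediction pieces we read qualify as predictive analytics.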

I'm pushing this so hard because I have long observed that in the most important decisions--whether in personal life or in business--we often trust our "gut feel", our intuition over measurable facts.  How many of us are purely rational in our decision making?  Even if we do make a list of pros and cons, because our weighting system is at best invented on the spot or at worst non-existent, we often end up at least as confused as when we started.  Or is this just my experience?  Well, yes, because I haven't carried out that survey...

Am I saying that BI is useless?  No, not at all.  Simply that, like every other tool, it's good for some things but not for others.  And if you can accept that, it follows that we need to expand the scope of what we do in decision making support.  The name "business intelligence" seems far too rational and limited for the type of support that most decision makers actually need.  For me, one of the most important trends in the past year or two has been the focus on the social aspect of decision making, as pioneered by Lyza, but now promoted by pretty much every BI vendor.  However, social networking support means a lot more than allowing comments or discussions on reports, dashboards and various visualizations.  It actually means providing a comprehensive, coherent environment where all interactions around a particular decision are recorded and tracked.  And, the last time I checked, much of that interaction occurs before the data crunching starts, and another chunk long after the BI tool has been shut down.  Decision making support needs to go far wider than the majority of BI vendors even imagine.

So, am I predicting that 2012 will be the year when BI tools finally "get it" that they are not the center of the decision maker's universe?  When BI vendors stop adding social networking bells and whistles to their tools and instead figure out how to be part of a larger Enterprise 2.0 effort unifying all social interactions around conversations, documents, analyses and more within the organization?

No, I don't think I can predict the future.  What I can say is that it's up to you to make your future, and one area where technology, both hardware and software, is rapidly improving is social networking support.  It would therefore be valuable to give some serious thought and focus in 2012 to how you could support collaborative decision making in the coming few years.


Posted December 19, 2011 10:15 AM
Lyzasoft's Scott Davis returned to the Boulder BI Brain Trust (BBBT) last Friday with Lyza version 3.0.  Here's my key takeaway.  Lyza is the first product I've seen that truly understands and delivers the type of collaborative environment needed for innovative and effective decision making.

Please read that last sentence again--the type of collaborative environment needed for innovative and effective decision making.  Note that I didn't say "collaborative Business Intelligence".  I believe that phrase can be a bit misleading, especially to BI people.  And Lyzasoft gets that.  Let me explain...

If you track back to the initial release of Lyza, the product focused on supporting the often iterative data analysis process that business analysts go through in order to reach conclusions and come to decisions.  This is a process that typically happens in the spreadsheet environment, because of the ease of trying, sharing and redoing that it offers.  Lyza offered an environment that enabled better control and management of that process.  And, most importantly, they began to build around that BI tool a collaborative environment where analyses and results could be shared and reused.  For more details, see "Playmarts: Agility with Control--Reconnecting Business Analysts to the Data Warehouse" and "Collaborative Analytics--Sharing and Harvesting Analytic Insights across the Business", two white papers I wrote in late 2008 and mid-2009.

Now fast-forward to last week and version 3.  What Scott demonstrated at the BBBT was entirely about collaboration.  The analytics tool that was Lyza version 1 was still there, but it had become simply one of any number of tools that a decision maker might use.  The focus of the new release is now firmly, and perhaps entirely, on supporting the collaborative process around decision making.  Rather than emphasizing the data, Lyza 3 seeks the intersection between people, their activities and the artifacts they create, use and share.  This emphasis on people, activities and things is not new in itself; what is new is the intuitive linkage between them and the focus on decision making and action taking that comes from prior Lyza BI tooling.  What we have here is what a decision support system really should look like--it's about supporting decision making, doh!

If we can look beyond the current hype on big data, the bling of tablets and the search for the holy grail of visualization, it becomes pretty clear that the only thing that finally matters in BI is the decision made and the action taken.  And... understanding how the decision makers got there, so that they can more easily and effectively repeat and refine that process in the future.  This puts Lyza on the cusp of the next big emerging trend in IT--the "automation" of the human interactions that occur around the data and applications that IT already provides.  I place quotes around "automation" because, of course, this will be a very different type of automation than we have been used to in the past.  This is the integration of Web 2.0 concepts and tools into the enterprise.  Facebook with a purpose.  Twitter in context.  Social networks with a goal.

With version 3, Lyza has stepped boldly beyond the safe and well-understood confines of what BI has mostly thought about so far.  For some, it may pose the question: shouldn't this type of function come from another market with a different audience?  My response, in the form of another question, is: what other market and audience should be looking at supporting, really supporting, decision makers?

The new collaborative Lyza will be available for free use from October.  I highly recommend giving it a test drive!

Posted September 14, 2011 10:53 AM
Following on from a previous discussion on the need for integrated information in modern business, and the focus on a consolidated process in my last post, I'd like to complete the picture here with a look at the role of people in the Business Intelligence of the future.

Traditional BI typically sees people in two roles.  First, they are the largely passive and individual receivers of information via reports or even dashboards.  Once that information is delivered, the classic BI tool bows out, even though the real value of the process comes only when the decision is made and the required action initiated.  Furthermore, the traditional BI process fails to link the action taken to results that can be measured in the real world, overlooking the vital concept of sense and respond described in my last post.  These two glaring omissions, among others, in the current BI world lead directly to the relatively poor return on investment demonstrated by many BI projects.

Second, BI treats people as largely disconnected providers of "information requirements" to the BI development process, often leading to failed or, at best, disappointing BI solutions for real user needs.  Agile BI approaches do address this problem to some extent.  However, the real issue is that the process of innovative decision making is largely iterative with final information needs often differing radically from initial ideas on what the requirements may be.  The path from such innovative analyses to ongoing production reporting using the discovered metrics is also poorly understood and seldom delivered by most current BI tools.

The good news is that many of these issues are central to the techniques and tools of Web 2.0, social networking and related developments.  However, simply adding a chat facility or document sharing to an existing BI tool is insufficient.  To truly deliver BI for the People, we require a significant rethinking of the framework in which BI is developed and delivered.  This framework includes (1) a well-managed and bounded social environment where users can safely experiment with information and collaborate on their analyses, (2) support for peer review and promotion to production of analyses of wider or longer-term value, and (3) an adaptive, closed-loop environment where changes in the real world can be directly linked to actions taken and thus to the value of the analyses performed.
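As a purely illustrative sketch of point (3), the closed loop amounts to keeping the decision, the action taken and the later real-world measurement in one linked record, so the value of an analysis can eventually be computed rather than guessed at. The structure and names below are my own invention, not a reference to any product.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical closed-loop record linking an analysis to its real-world result.
@dataclass
class DecisionRecord:
    analysis_id: str          # which analysis informed the decision
    decision: str             # what was decided
    action: str               # what was actually done
    expected_outcome: float   # e.g. forecast uplift in customer retention
    measured_outcome: Optional[float] = None  # filled in later from the real world

    def realised_value(self) -> Optional[float]:
        # Closing the loop: compare what happened with what the analysis promised.
        if self.measured_outcome is None:
            return None
        return self.measured_outcome - self.expected_outcome

record = DecisionRecord("churn-model-v2", "target at-risk customers",
                        "ran retention campaign", expected_outcome=0.05)
record.measured_outcome = 0.03
print(record.realised_value())  # roughly -0.02: the action under-delivered
```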

Today's users are of a different generation to those BI has previously supported.  Generation Y (born 1980-2000, or thereabouts, depending on which social scientist you follow) is the first generation to have grown up with pervasive electronic connectivity and collaboration in their personal lives.  They bring these expectations, as well as some very different social norms, into the business world, and are now beginning to assume positions of decision-making responsibility in their organizations.  They are set to demand radical changes in the way we make and support decisions in business.

I'll be discussing these issues at three seminars I'm presenting on the transformation of BI into Enterprise IT Integration in Europe: a half day each in Copenhagen and Helsinki (4 and 5 April) and a full two-day deep dive in Rome (11-12 April).  

Posted March 29, 2011 3:56 AM
Preparing materials for a seminar really forces you to think!  I just finished the slides for my two-day class in Rome next week, and after I got over my need for a strong drink (a celebration, of course), I got to reflect on some of what I had discovered.

Perhaps the most interesting were the amazing changes in the database area that have been happening over the past couple of years.  A combination of hardware advances and software innovations has come together with a recognition that data is no longer what it once was, to pose some fundamental questions about how databases should be constructed.

Let's start on the business side - always a good place to start.  Users now think that their internal IT systems should behave like a combination of Google, Facebook and Twitter.  Want an answer to the CEO's question on plummeting sales?  Just do a "search", maybe "call a friend", join it all together and voila!  We have the answer. 

From an information viewpoint, this brings up some very challenging questions about the intersection of soft (aka unstructured) information and hard (structured) data and how one ensures consistency and quality in that set.  IT's problem is no longer just combining hard data from different sources; it's about parsing and qualifying soft information as well.  This is not a truly new problem.  Data modelers have struggled with it for years.  It's the speed with which it needs to be done that causes the problem.

So, what has this got to do with new software and hardware for databases?  Well, the key point is that database thinking has suddenly moved on from strict adherence to the relational paradigm.  The relational model is an extraordinarily structured view of data.  Relational algebra is a very precise tool for querying data.  You need a strong understanding of both to make valid queries, but do you really want your users to think that way?  Should you necessarily store the information physically in that model?  When you free yourself of these assumptions, you can begin to think in new ways.  Store the data in columns instead of rows?  Perfect!  A mix of row- and column-oriented data, and maybe some in memory only?  Yes, can do!  And then there's mixing searching (a soft information concept) with querying (a hard data concept) to create a hybrid result.  That's easy too!
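To make the row-versus-column point concrete, here is a deliberately simplified Python sketch (no particular product in mind): the same table held in both layouts, and an aggregate query that only ever needs one column.

```python
# Illustrative only: the same table held row-wise and column-wise,
# and how an aggregate query touches the data in each layout.
rows = [
    {"region": "EMEA", "product": "A", "sales": 120},
    {"region": "EMEA", "product": "B", "sales": 80},
    {"region": "AMER", "product": "A", "sales": 200},
]

# Column store: one contiguous array per attribute.
columns = {
    "region": ["EMEA", "EMEA", "AMER"],
    "product": ["A", "B", "A"],
    "sales": [120, 80, 200],
}

# "Total sales" in the row store scans every field of every row...
total_row_store = sum(r["sales"] for r in rows)

# ...while the column store reads only the one column the query needs,
# which is why analytic workloads tend to favour columnar layouts.
total_column_store = sum(columns["sales"])

assert total_row_store == total_column_store == 400
```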

And on the edges of the field, there are even more fundamental questions being asked.  Do we always need consistency in our databases?  Can we do databases without going to disk for the data?  Could we do away with physically modeling the data and just let the computer look after it?  The answers to these questions and more like them are not what you might expect if you've been around the database world for 20 years.  And with those different answers, the overall architecture of your IT systems is suddenly open to dramatic change.

Believe me, the first businesses to adopt some of these approaches are going to gain some extraordinary competitive advantages.  Watch this space!

Posted April 8, 2010 9:58 AM
Having worked with Scott Davis, CEO of Lyzasoft, and produced a white paper on Collaborative Analytics in the first half of 2009, I was not surprised that version 2.0 of Lyza had a major emphasis in the same area.  What did surprise me, however, was how far they have advanced the concepts and implementation in such a short timeframe!

Successful collaboration between decision makers requires an environment that facilitates a free-flowing but well-managed conversation about ongoing analyses as they evolve from initial ideas to full-fledged solutions to business problems.  Consider a common scenario.  The first analyst gathers data she considers relevant and creates an initial set of assumptions, data manipulations and results.  She shares this via e-mail with her peers for confirmation, and she receives suggestions for improvement, some of which she incorporates in a new version.  Her manager reviews the work personally and makes further suggestions; a new version emerges.  She also shares the intermediate solution with a second department, where another analyst creates a further solution based on the original.  Meanwhile, the first analyst finds an error in her logic buried deep in cell Sheet3!AB102...

We all know the problems with multiple unmanaged copies, rework, silently propagated errors and so on in the usual spreadsheet- and e-mail-based business analysis environment.  Lyza and Lyza Commons together address these issues by creating a comprehensive tracking and auditing mechanism for every step of an analysis and providing an integrated environment for sharing and discussing work among collaborators.  Integral metadata links all copies derived from an initial analysis.  Twitter-like conversations (called Blurbs) about an analysis are linked to the referenced object, creating a comprehensive context for the conversation and the underlying analysis.  The folks at Lyzasoft have also come up with a security concept for sharing analyses, called Mesh Trust, that should make sense in most enterprise collaboration environments.
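To illustrate the general idea of integral metadata linking derived copies and conversations to their source, here is a small Python sketch. The names and structures are my own invention for illustration, not Lyzasoft's actual implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical structures, not Lyza's actual model: a sketch of lineage
# metadata linking derived analyses and conversation snippets to their source.
@dataclass
class Analysis:
    analysis_id: str
    author: str
    derived_from: Optional[str] = None                # link back to the parent analysis
    blurbs: List[str] = field(default_factory=list)   # comments kept in context

    def derive(self, new_id: str, author: str) -> "Analysis":
        # Every copy records its parent, so provenance is never lost.
        return Analysis(new_id, author, derived_from=self.analysis_id)

original = Analysis("sales-q3", "first analyst")
shared = original.derive("sales-q3-dept2", "second analyst")
shared.blurbs.append("Adjusted for returns; see parent for raw figures.")
print(shared.derived_from)  # -> "sales-q3"
```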

My bottom line?  Lyza and Lyza Commons 2.0 provide a seamless blending of analytic function, managed and controlled access to information resources and enterprise-adapted social networking around analytic results and their provenance.  This is precisely the type of function needed by businesses that want to regain control of spreadmarts that have run amok.  This is the right conceptual foundation for real, meaningful business insight and innovation going forward.

Posted February 25, 2010 2:58 PM