
Blog: Barry Devlin

Barry Devlin

As one of the founders of data warehousing back in the mid-1980s, a question I increasingly ask myself over 25 years later is: Are our prior architectural and design decisions still relevant in the light of today's business needs and technological advances? I'll pose this and related questions in this blog as I see industry announcements and changes in the way businesses make decisions. I'd love to hear your answers and, indeed, questions in the same vein.

About the author >

Dr. Barry Devlin is among the foremost authorities in the world on business insight and data warehousing. He was responsible for the definition of IBM's data warehouse architecture in the mid '80s and authored the first paper on the topic in the IBM Systems Journal in 1988. He is a widely respected consultant and lecturer on this and related topics, and author of the comprehensive book Data Warehouse: From Architecture to Implementation.

Barry's interest today covers the wider field of a fully integrated business, covering informational, operational and collaborative environments and, in particular, how to present the end user with an holistic experience of the business through IT. These aims, and a growing conviction that the original data warehouse architecture struggles to meet modern business needs for near real-time business intelligence (BI) and support for big data, drove Barry’s latest book, Business unIntelligence: Insight and Innovation Beyond Analytics, now available in print and eBook editions.

Barry has worked in the IT industry for more than 30 years, mainly as a Distinguished Engineer for IBM in Dublin, Ireland. He is now founder and principal of 9sight Consulting, specializing in the human, organizational and IT implications and design of deep business insight solutions.

Editor's Note: Find more articles and resources in Barry's BeyeNETWORK Expert Channel and blog. Be sure to visit today!

Recently in Business Integrated Insight Category

[Figure: AIW.png]

In 1988, I published the first data warehouse architecture.  Its aim was to provide consistent, integrated data to business users in support of cross-enterprise decision making.  Quality and consistency were the key drivers; at that time the major issues were that operational / transactional systems were highly inconsistent and direct access to them was discouraged for reasons of performance and security.  Business users were happy to get whatever consistent view they could, and, in general, wanted to see a stable representation of the business on a monthly, weekly or occasionally daily basis.  This architecture has remained a foundation of business intelligence ever since.

Twenty-one years later, in 2009, I introduced Business Integrated Insight (BI2).  With emerging needs like near real-time decision making in operational BI and increasing use of non-traditional data coming from Web 2.0 and other sources, this new architecture had to address a far wider scope than the original data warehouse.  While consistency and integrity remain important considerations, today's business needs are far more about instant access to the ever-changing ebb and flow of trends in sales, manufacturing and more.  It was becoming clear that a new, over-arching architecture was required to cover all the information, processes and people of the business.

Now, three years later, it's clear that traditional BI is racing to keep up with developments in big data, data virtualization and the cloud, mobile computing as well as social networking and collaboration.  All these topics were incorporated in BI2 from the outset.  Now, as the technology moves to the mainstream, we can and must dive deeper into these specific areas.  Big data clearly makes it impossible to route all information through an enterprise data warehouse (EDW).  But, how will that impact our need for consistency and integrity?  I envisage we will move from the old adage of "a single version of the truth" to multiple versions depending on users' needs, with one particular version that I call core "business information" being the source of truth for external reporting and financial governance needs.

Data virtualization has also become big news in recent years.  In many ways, it's a technology whose time has come.  With the explosion of data volumes and varieties, users need ways to combine data on the fly with confidence and performance.  Data virtualization addresses these needs and increasingly overlaps with functionality we traditionally associate with ETL.  The result, data integration, as it's sometimes called, enables us to envisage a future where data is made available to users as they need it, whether real-time or integrated and historicized.
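To make the idea concrete, here is a minimal sketch of the principle in Python.  The table names, regions and figures are my own invention, and an in-memory SQLite database stands in for a real virtualization product: a logical view unions a historicized warehouse table with a live operational feed, so the combination happens at query time rather than through a prior ETL run.

```python
import sqlite3

# Two hypothetical sources: a "warehouse" table of historicized sales
# and a "live" operational feed that has not been through ETL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse_sales (region TEXT, amount REAL)")
conn.execute("CREATE TABLE live_sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO warehouse_sales VALUES (?, ?)",
                 [("EMEA", 1200.0), ("APAC", 800.0)])
conn.executemany("INSERT INTO live_sales VALUES (?, ?)",
                 [("EMEA", 150.0)])

# The "virtual" layer: users query one logical name; the union of the
# two physical sources is resolved on the fly, at read time.
conn.execute("""CREATE VIEW all_sales AS
                SELECT region, amount FROM warehouse_sales
                UNION ALL
                SELECT region, amount FROM live_sales""")

for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM all_sales "
        "GROUP BY region ORDER BY region"):
    print(region, total)
```

The point of the sketch is that EMEA's total already includes the latest operational rows without any batch integration step having run.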

And, against the background of all this upheaval in data and infrastructure, we also see a new breed of technology-savvy business users moving into positions of power.  These so-called millennials are demanding seamless, mobile access to the information they need, as well as the ability to play with it as required.  The rule of IT over the data and application resources of the organization is coming to an end.  But, that's not to say that IT has no future role.  In fact, I see more of a fully symbiotic partnership between business and IT emerging, a partnership I call the "biz-tech ecosystem".

My 2012 BI2 Seminar in Rome on 11-12 June explores these new directions and provides guidance on their introduction in your existing data warehouse environment.  It also introduces the Advanced Information Warehouse, shown above, as the next step on your journey from a traditional data warehouse to comprehensive business integrated insight.

Posted May 28, 2012 6:20 AM
[Figure: CognitiveComputingWatson.jpg]

I've previously written about IBM Watson, its success in "Jeopardy!" and some of the future applications that its developers envisaged for it.  IBM has moved the technology towards the mainstream in a number of presentations at the Information on Demand (IOD) Conference in Las Vegas last week.  While Watson works well beyond the normal bounds of BI, analyzing and reasoning in soft (unstructured) information, the underlying computer hardware is very much the same (albeit faster and bigger) as we have used since the beginnings of the computer era.

But, I was intrigued by an announcement that IBM made last August and that I came across a few weeks ago:
"18 Aug 2011: Today, IBM researchers unveiled a new generation of experimental computer chips designed to emulate the brain's abilities for perception, action and cognition. The technology could yield many orders of magnitude less power consumption and space than used in today's computers.

In a sharp departure from traditional concepts in designing and building computers, IBM's first neurosynaptic computing chips recreate the phenomena between spiking neurons and synapses in biological systems, such as the brain, through advanced algorithms and silicon circuitry. Its first two prototype chips have already been fabricated and are currently undergoing testing.

"Called cognitive computers, systems built with these chips won't be programmed the same way traditional computers are today. Rather, cognitive computers are expected to learn through experiences, find correlations, create hypotheses, and remember - and learn from - the outcomes, mimicking the brain's structural and synaptic plasticity... The goal of SyNAPSE is to create a system that not only analyzes complex information from multiple sensory modalities at once, but also dynamically rewires itself as it interacts with its environment - all while rivaling the brain's compact size and low power usage."

Please excuse the long quote, but, for once :-), the press release says it as well as I could!  For further details and links to some fascinating videos, see here.

What reminded me of this development was another blog post from Jim Lee in Resilience Economics entitled "Why The Future Of Work Will Make Us More Human". I really like the idea of this, but I'm struggling with it on two fronts.

Quoting David Autor, an economist at MIT, Jim argues that outsourcing and "othersourcing" of jobs - to other countries and machines respectively - are polarizing labor markets towards opposite ends of the skills spectrum: low-paying service-oriented jobs that require personal interaction and the manipulation of machinery in unpredictable environments at one end, and well-paid jobs that require creativity, tolerance of ambiguity, and high levels of personal training and judgment at the other.  The center ground - a vast swathe of mundane, repetitive work that computers do much better than us - will disappear.  These are jobs involving middle-skilled cognitive and productive activities that follow clear and easily understood procedures and can reliably be transcribed into software instructions or subcontracted to overseas labor.  This will leave two types of work for humans: "The job opportunities of the future require either high cognitive skills, or well-developed personal skills and common sense," says Lee in summary.

My first concern is the either-or in the above approach; I believe that high cognitive skills are part and parcel of well-developed personal skills and common sense.  At which end of this polarization would you place teaching, for example?  Education (in the real meaning of the word - from the Latin "to draw out" - as opposed to hammering home) spans both ends of the spectrum.

From the point of view of technology, my second concern is that our understanding of where computing will take us, even in the next few years, has been blown wide open, first by Watson and now by neurosynaptic computing.  What we've seen in Watson is a move from Boolean logic and numerically focused computing to a way of understanding and using soft information that is much closer to the way humans deal with it.  Of course, it's still far from human.  But, with an attempt to "emulate the brain's abilities for perception, action and cognition", I suspect we'll be in for some interesting developments in the next few years.  Anyone else remember HAL from "2001: A Space Odyssey"?

Posted November 1, 2011 5:30 AM
Last week I blogged on the Future of Business Intelligence, and today, as I'm making final preparations for two events dealing with this topic, I'd like to add a few further thoughts that struck me and deserve some emphasis.  (The events are in Brussels tomorrow, 15 June, and in Rome on 22-24 June.  Between them they bring together an outstanding line-up of international speakers, including Rick van der Lans and Jan Henderyckx at the first event and Colin White, Claudia Imhoff, Cindi Howson and James Taylor at the second.  Not to mention yours truly at both!)

To start, two trends in addition to last week's five:

(6)    Operational vs. informational distinctions are disappearing.  In the past, decision support was seen as a relatively relaxed process, typically based on a stable, point-in-time view of the business.  Today, many decisions need to be made based on near real-time data.  This process is known as operational BI and breaks down the boundaries between operational and informational systems, a distinction on which the original BI architecture is based.

(7)    Service oriented architecture (SOA) is reinventing the application landscape.  Although SOA has been promoted since the early 2000s, its uptake has been relatively slow.  However, most application developers recognize that it is the only concept that has the potential to address the business need for more rapid and flexible application development.  The impact on BI is significant.  ETL techniques will need to change radically as applications become plug-and-play.  On the positive side, SOA offers the best chance in twenty years to finally address the metadata problem in BI--simply because SOA can only work if complete, live metadata exist in the operational environment.
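As an illustration of that last point, here is a small, hypothetical Python sketch (the service name, fields and functions are my own, not from any real SOA product) of a service that publishes its own live metadata alongside its data, so that consumers - including ETL and BI tools - can discover and validate structure at call time rather than rely on stale documentation:

```python
import json

def describe_orders_service():
    """Metadata endpoint: the service's own live description of its payload."""
    return {
        "service": "orders",
        "version": "1.0",
        "fields": {"order_id": "string", "amount": "decimal",
                   "placed_at": "timestamp"},
    }

def get_orders():
    """Data endpoint: records conforming to the published metadata."""
    return [{"order_id": "A-1", "amount": "19.99",
             "placed_at": "2011-06-14T09:00:00Z"}]

# A consumer validates each record against the metadata it just fetched.
meta = describe_orders_service()
for row in get_orders():
    assert set(row) == set(meta["fields"])
print(json.dumps(meta["fields"], sort_keys=True))
```

The design point is simply that metadata and data come from the same live service, so they cannot drift apart the way separately maintained documentation does.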

It is my firm belief that these seven trends, when taken together, spell the end of our current way of thinking about and implementing Business Intelligence.  I'm not suggesting that BI is at the end of its useful life.  Far from it.  In business terms, BI is growing dramatically in popularity and in the value delivered to business users; its influence has extended across the entire spectrum of business activities--from the board room all the way down to the factory floor--and exciting and profitable new uses are constantly being invented.  However, this success and ever-broadening remit have significant consequences.

First, it becomes clear that the boundary assumptions and resulting structures that framed BI twenty years ago have become totally obsolete.  For example, what forward-looking business manager today would be satisfied with traditional BI month-end data when making daily operational decisions?  Or be willing to operate with only internally sourced data?  The rules have changed.  The required timeliness and breadth of information today are dramatically different from those seen two decades ago when the BI architecture was emerging.

Second, BI today encroaches into areas traditionally covered by other IT disciplines.  Operational BI and collaborative BI are obvious examples of the blurring of boundaries that were once clearly demarcated in both technological and organizational terms.

My personal conclusion, about which I've written extensively elsewhere, is that we need a new architecture for BI that encompasses the entire gamut of IT support for business.  Essentially, we need to move our thinking up a level from separate operational, collaborative and BI architectures to a comprehensive, joined-up Enterprise IT Architecture that I call Business Integrated Insight (BI to the power of two).

Of course, I'm not alone in thinking about the future of BI in this time of dramatic change.  So, come along to the conferences in Brussels or Rome to hear what other leaders in the field think.

Posted June 14, 2011 6:36 AM
If there's one thing BI folks love more than playing with data, it's classifying it.  And one of the endless debates has been about the data, information, and knowledge taxonomy.  How we could simplify our lives if there was only one all-encompassing term!  Or maybe not...

I was teaching a Business Integrated Insight (BI2) seminar in Helsinki a couple of weeks ago, and a TDWI member and Development Director of Visual Management Ltd, Vesa Tiirikainen, shared with me that Finnish has a single word covering data, information and knowledge: "tieto".  So I asked myself: what if there was only one word in English?  Would our perception of IT change?  What would it mean for topics like Data Governance?

First, let's take a look at the traditional meanings of the English words.  From the Oxford English Dictionary, we have:
Data: (1) facts and statistics collected together for reference or analysis, and (2) the quantities, characters and symbols on which operations are performed by a computer
Information: facts provided or learned about something or someone
Knowledge: facts, information and skills acquired by a person through experience or education; the theoretical or practical understanding of a subject

The definitions are remarkably similar.  All are about facts and, interestingly, not about opinions, beliefs, etc., which seems a huge omission to me.  Data is a little more about how facts are manipulated.  Knowledge has more of a people orientation.  But, in general, not too helpful!

So allow me to propose perhaps more meaningful definitions.  Information, to me, is the starting point.  Information describes real-world objects in words, numbers, pictures and similar artifacts of communication in a way that is suitable for human cognition, use and processing.  It is inherently vague, context-dependent and open to interpretation.  I call it soft information.  Data is extracted from information by a process (modeling) of formalizing meaning and separating structure and meaning (metadata) from values.  This I call hard information.  Finally, knowledge is information that is understood and internalized by people so that it can be put to practical and innovative use.
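A toy example may help.  The sketch below (in Python; all names and values are invented) shows the modeling step that turns soft information into hard information: structure and meaning (the metadata) are separated from the bare values, and recombining the two yields a record a machine can process.

```python
# Soft information: vague, context-dependent, open to interpretation.
soft = "EMEA sold about 1.2m units of widgets in Q1, well up on last year."

# Hard information, step 1: modeling separates structure and meaning
# (metadata) from values. The schema is declared once ...
schema = ("region", "product", "quarter", "units")

# ... step 2: the values themselves carry no meaning on their own.
values = ("EMEA", "widgets", "2012-Q1", 1200000)

# Recombining metadata and values yields a formalized, queryable record.
record = dict(zip(schema, values))
print(record["units"])
```

Notice what the hardening step discards: the "about" and the "well up on last year" - exactly the vagueness and context that make soft information soft.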

On that basis, could we manage with a single word to cover all categories?  As BI practitioners, I suspect not.  Data governance and information management also demand such clarity, as I'm sure Mike Ferguson will touch upon in Rome on May 9-11.  Understanding these categories has been vital to me as I've created the BI2 architecture, because they require fundamentally different processing methods, storage approaches, and so on.

But users?  From the business user's viewpoint, I believe we should use a single word - and let's choose "information".  In the business view, the underlying structure and metadata separation of data is actually unimportant.  And, increasingly, computers can store, manipulate and make available soft and hard information in a seamless fashion.  Knowledge, too, is being folded into the computer world through social media and networking approaches where the human interaction around a set of information becomes a key component of how information is used.

In the real world, the Finns probably have it right - one word should be sufficient.  But for BI and IT folks, we just need to make life more complex... but for good reason!

Posted April 26, 2011 10:43 AM
Following on from a previous discussion on the need for integrated information in modern business, and the focus on a consolidated process in my last post, I'd like to complete the picture here with a look at the role of people in the Business Intelligence of the future.

Traditional BI typically sees people in two roles.  First, they are the largely passive and individual receivers of information via reports or even dashboards.  Once that information is delivered, the classic BI tool bows out, even though the real value of the process comes only when the decision is made and the required action initiated.  Furthermore, the traditional BI process fails to link the action taken to results that can be measured in the real world, overlooking the vital concept of sense and respond described in my last post.  These two glaring omissions, among others, in the current BI world lead directly to the relatively poor proven return on investment of many BI projects.

Second, BI treats people as largely disconnected providers of "information requirements" to the BI development process, often leading to failed or, at best, disappointing BI solutions for real user needs.  Agile BI approaches do address this problem to some extent.  However, the real issue is that the process of innovative decision making is largely iterative with final information needs often differing radically from initial ideas on what the requirements may be.  The path from such innovative analyses to ongoing production reporting using the discovered metrics is also poorly understood and seldom delivered by most current BI tools.

The good news is that many of these issues are central to the techniques and tools of Web 2.0, social networking and related developments.  However, simply adding a chat facility or document sharing to an existing BI tool is insufficient.  To truly deliver BI for the People, we require a significant rethinking of the framework in which BI is developed and delivered.  This framework includes (1) a well-managed and bounded social environment where users can safely experiment with information and collaborate on their analyses, (2) support for peer review and promotion to production of analyses of wider or longer-term value, and (3) an adaptive, closed-loop environment where changes in the real-world can be directly linked to actions taken and thus to the value of the analyses performed.

Today's users are of a different generation to those BI has previously supported.  Generation Y (born 1980-2000, or thereabouts, depending on which social scientist you follow) is the first generation to have grown up with pervasive electronic connectivity and collaboration in their personal lives.  They bring these expectations, as well as some very different social norms, into the business world, and are now beginning to assume positions of decision-making responsibility in their organizations.  They are set to demand radical changes in the way we make and support decisions in business.

I'll be discussing these issues at three seminars I'm presenting on the transformation of BI into Enterprise IT Integration in Europe: a half day each in Copenhagen and Helsinki (4 and 5 April) and a full two-day deep dive in Rome (11-12 April).  

Posted March 29, 2011 3:56 AM

