
Business Intelligence Pervasiveness (and is it Wise to Ignore those Shadow Spreadbasemarts?)

Originally published March 26, 2009

I attended some sessions at Gartner's 2009 Business Intelligence Summit. At one of them, Gartner Distinguished Analyst Donald Feinberg discussed "The CIO's View of Business Intelligence" and reflected on a commonplace pattern across the IT community: senior IT staff members do not effectively engage business sponsors when communicating the value of business intelligence (BI). This was no particular surprise to me, nor to many of the others listening to the talk. I was, though, puzzled by two other notions that were brought up: one was the focus of a lengthy chunk of presentation bandwidth, while the other was almost a "one-off" comment tossed out and then completely dropped.

The first notion had to do with the concept of the "Business Intelligence Competency Center," or BICC. One specific comment suggested that for fifteen years IT departments had been advised to lobby the business side for support of a BICC to manage the tools, techniques, and best practices associated with business intelligence, but that few organizations had gotten around to doing it. But what does a BICC do? Apparently (based on a scan of articles, presentations, white papers, etc.), the BICC team standardizes platforms, communicates the value of business intelligence across the organization, provides cross-functional collaboration, supports projects, aligns business intelligence and performance management with organizational business objectives, manages tools, provides data stewardship, performs requirements analysis, organizes professionals, and provides training.

These are all great things to do, but emerging trends suggest that the notion of a BICC may become anachronistic as the capabilities supported by a business intelligence framework become more pervasive and embedded within everyday applications and business processes. Reporting, parameterized queries, and interactive analytics are examples of BI techniques that are rapidly being embedded within business applications, without the need for the user to jump out and access a BI front-end tool, and this may preempt the need for some aspects of the BICC. Better yet, many agile software vendors are creating low-cost tools (data analyzers, visualization engines, mash-up makers) that put the power of analysis in the hands of the business user, perhaps even "promoting" them into the lower echelon of the power-user class. Lastly, more tools are integrating analytical models into the framework (e.g., Microsoft's integration of data mining models into SQL Server) to let anyone experiment with a range of models without needing to know how they work.
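
To make the "embedded BI" idea concrete, here is a minimal sketch of a parameterized query surfaced directly inside a business application, so the user never has to jump out to a separate BI tool. The table and column names are hypothetical, and Python's built-in sqlite3 stands in for whatever database the application actually uses:

    import sqlite3

    # Hypothetical sketch: a parameterized sales query embedded directly in a
    # business application. Table and column names are illustrative only.
    def sales_by_region(conn, region, start_date, end_date):
        # The user supplies values; the SQL itself stays fixed.
        cur = conn.execute(
            """SELECT product, SUM(amount) AS total
               FROM sales
               WHERE region = ? AND sale_date BETWEEN ? AND ?
               GROUP BY product
               ORDER BY total DESC""",
            (region, start_date, end_date),
        )
        return cur.fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL, sale_date TEXT)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?, ?, ?)",
        [("East", "Widget", 120.0, "2009-01-15"),
         ("East", "Gadget", 80.0, "2009-02-10"),
         ("West", "Widget", 50.0, "2009-01-20")],
    )
    for product, total in sales_by_region(conn, "East", "2009-01-01", "2009-03-31"):
        print(product, total)   # Widget 120.0, then Gadget 80.0

The point is that the analytic capability lives inside the application workflow; the business user picks a region and a date range and never sees a BI front end at all.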

The one-off comment had to do with the multitude of desktop databases that exist across the organization. I am paraphrasing the speaker, who noted that the CIO's focus on business intelligence sat apart from the thousands of spreadsheets and desktop databases, "but we could ignore those for now." The problem is that in our interactions with clients, it is exactly those "spreadbasemarts" that are used to generate the analyses and reports that bubble up through the presentation slideware into executive briefings. Desktop databases and spreadsheets enable business users to get the answers they need immediately, without having to navigate the IT development lifecycle. In some places, these artifacts have become de facto components of production processes: data is extracted from one source and dumped into a desktop database, which is managed within some enterprise sharing environment, which in turn sources yet another production application. In essence, to some extent the business is being run using these shadow data repositories, outside the realm of the traditional IT mandate.

The offhandedness of the comment also suggested that the rampant proliferation of desktop spreadbasemarts is a bad thing; I beg to differ. Their existence is, perhaps, indicative of the same trends that contribute to the imminent obsolescence of the BICC: they put power in the hands of people who are stymied by IT practices and want to game the system to get the answers they need right away.

I have a different problem with them, though. Clearly there are business dependencies on these data sets, but in the absence of governance, there is an even greater potential for inconsistency across the generated reports and presentations, leading to an increased reactive need for reconciliation. More critically, the reports generated are neither subject to review for correctness nor are they auditable, since the results may differ based on when the data was extracted, who generated the tables, and who conjured up the graphs. On the other hand, forcing desktop analytics into the yoke of IT governance may have the opposite effect, driving more business users into the shadows.

Perhaps the trick lies in clarifying what that governance really is: instead of the typical role-based infrastructure (which allocates additional work to fully tasked individuals whose stewardship successes are neither measured nor recognized), institute a more operational set of guidelines directed specifically at the business analysts. Desktop analysis data sets should at least be registered with a centralized authority so that dependencies, impacts, and conflicts can be analyzed. Registering and sharing analyses or reports that use extracted data might provide benefit across the board, enabling notifications (or even automated updates) when the underlying data sets change. And as performance management and desktop analysis become more pervasive, adjusting the culture to one of sharing and collaboration may finally establish that community of interest (inherent in the concept of the BICC) based on user demand instead of IT fiat.
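
As a sketch of what that operational governance might look like in practice, a minimal registry could track desktop data sets, the upstream sources they draw on, and notify interested parties when those sources change. All names here are hypothetical; this is an illustration of the idea, not a prescribed implementation:

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    # Hypothetical "spreadmart registry": analysts register their desktop data
    # sets and the upstream sources they extract from, so dependencies can be
    # analyzed and owners notified when a source changes.
    @dataclass
    class DatasetRegistration:
        name: str            # e.g., "q1_sales_forecast.xls"
        owner: str           # the business analyst responsible for it
        sources: List[str]   # upstream systems/tables it was extracted from

    class SpreadmartRegistry:
        def __init__(self) -> None:
            self._datasets: Dict[str, DatasetRegistration] = {}
            self._subscribers: List[Callable[[str, str], None]] = []

        def register(self, reg: DatasetRegistration) -> None:
            self._datasets[reg.name] = reg

        def subscribe(self, callback: Callable[[str, str], None]) -> None:
            # Callbacks receive (changed_source, dependent_dataset_name).
            self._subscribers.append(callback)

        def dependents_of(self, source: str) -> List[str]:
            # Impact analysis: which desktop data sets draw on this source?
            return [d.name for d in self._datasets.values() if source in d.sources]

        def source_changed(self, source: str) -> None:
            # Notify subscribers about every data set affected by the change.
            for dataset in self.dependents_of(source):
                for notify in self._subscribers:
                    notify(source, dataset)

    registry = SpreadmartRegistry()
    registry.register(DatasetRegistration("q1_forecast.xls", "jdoe", ["crm.orders"]))
    registry.subscribe(lambda src, ds: print(f"{ds} may be stale: {src} changed"))
    registry.source_changed("crm.orders")  # q1_forecast.xls may be stale: crm.orders changed

Even something this lightweight would let a central authority answer "what breaks if this table changes?" without forcing the analysts' day-to-day work through the IT development lifecycle.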


Comments


Posted June 11, 2009 by George Allen

Good article.  I work in a healthcare environment and have been surprised by the number of "spreadmarts" in use in this environment and how little of their intrinsic value has rolled up into the organization's BI infrastructure.  As I am rolling out the knowledge management structure for this organization, one of my main tasks will be to document all of these little stores of data, determine which should be rolled up into the platform, and also establish some standardization of the tools.  Granted, I don't want to disturb the effectiveness of the tools as they are, but rather bring their value into the organization's decision tree and enhance the capabilities of the tools.

The BICC idea as laid out in this article is definitely a dinosaur, but appears to be the place I find my team in.  I am not part of IT, rather in a function within the director's (CEO) office, but looked upon as the "experts" in BI.  I think it is appropriate in a way so that the ownership of the process is not laid completely on IT but on the business side of the house.  IT is a part of the process, but should never drive it.

George O. Allen

NWIHCS VA

This comment is my own opinion and not the opinion or policy of the Department of Veterans Affairs.

 


Posted March 30, 2009 by Mark Flaherty

I think spreadmarts are a fascinating topic. I doubt there will be a perfect solution, but it is an interesting area to watch. We (InetSoft) make BI software, and our approach is to try to accommodate business users as much as possible.

Besides making the software easy to use - which everyone should strive for - the novel thing we do is let users import those spreadmarts into the BI platform so they become shareable and more reliable. We also let users mash up internal and external data in the platform, which is a common reason for the xls in the first place. I think that's the direction vendors and internal IT need to go: remove frustrations with centralized BI, and make it easier for non-technical folks to get on board.
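
For illustration, a minimal, generic sketch of that kind of internal/external mash-up (hypothetical column names, plain pandas code, not any particular product's API):

    import pandas as pd

    # Hypothetical mash-up: join an internal spreadsheet extract with
    # externally sourced data so the combined view lives in a shareable
    # platform instead of a one-off xls.
    internal = pd.DataFrame({            # e.g., rows exported from a local spreadmart
        "customer_id": [101, 102, 103],
        "q1_revenue": [25000.0, 13500.0, 40200.0],
    })
    external = pd.DataFrame({            # e.g., third-party firmographic data
        "customer_id": [101, 102, 104],
        "industry": ["Retail", "Healthcare", "Finance"],
    })

    # Left join keeps every internal record and flags customers the external
    # source doesn't know about (industry comes back NaN for those).
    mashup = internal.merge(external, on="customer_id", how="left")
    print(mashup)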

If anyone is interested in reading more on this data mashup topic and why it's good for business, I wrote a whitepaper on it at http://www.inetsoft.com/resources/prDM/evaluate/whitepapers/


Posted March 27, 2009 by David Loshin

Guys, thanks for both of your comments. Joseph, I share your desire to drive a means for managing one official version of the facts, but in many situations that is difficult, if not impossible, when there is a difference between the "facts" and what everyone's perception of "truth" is. Even in the example you gave regarding sales tax, I am sure that there are some organizations that provide the data for an audit without themselves being completely convinced of its veracity. But perhaps close to the facts is good enough in most cases.

 

John, I appreciate your suggestions as well, as long as managing the meta-meta-meta data doesn't get *too* out of hand. It can turn into a huge time sink with little perceived business benefit.


Posted March 26, 2009 by Joseph Subits

A few years ago, before the economy tanked and distracted those in the executive suite with more pressing matters, the notion that a mistake in the routine processing taking place in the production spreadmarts could lead to a materially relevant misstatement of the books actually kept a few C-level folks up at night.  In fact, one compliance software vendor has a great slide citing 3 or 4 examples of significant fines, and even jail time in one case, because of a mistake made in a critical financial spreadsheet. With that much at stake, you start to wonder how most corporations can afford NOT to have one version of the facts.  Furthermore, after seeing the recent chaos in the world's financial institutions, maybe it is time for some sort of regulatory edict for all corporations to maintain an official version of the facts to some generally acceptable standard for all record keeping.  The closest that I have seen to anything like this are typical state sales tax audits.  A company must show that the sales amount for a particular period stated on its tax return ties out to the same amount in its consolidated general ledger, local ledgers, and customer invoice history data.  Most companies scramble to pull this together at the time of audit.  However, I'm willing to bet that at least one company out there can produce the entire balanced picture from a fully integrated and fiscally balanced enterprise data warehouse.


Posted March 26, 2009 by John Mowbray

Loved it. More than 20 years ago, I had the same debates with IT operations and application delivery people about whether to manage distribution of mainframe data or just to continue to require access through the green screens.  And the work-arounds: hello screen scrapers and 3270 emulators!

I'll throw out a few other "offhand" thoughts that were triggered by the article.

Certainly the horsepower of computing has increased over the years, but it seems the type of problem hasn't changed: faster machines and newer software, targeting/tailoring more info for more workers at more levels.

I make the assumption that IT isn't the master of what and how user departments consume data, because it can't be - there's too much that changes too fast.

I believe there are a lot of things in the business and technology worlds that combine to make the user-xls approach the most effective (e.g. demonstrable/measurable costs and benefits, as well as local control of the requirements, solutions, delivery timing, and cost).

Since I'm a data/object modeller, I like the idea of rolling things up a level - e.g., generalization/inheritance.  It reduces the number of moving parts, and the reduced complexity lets me get my brain around more related bits - other types/classes of data, other sources and sinks, other patterns.  When I do that while mapping data element use to business processes, I see each class of data has a many-to-many relationship with the real row-level data elements in the DW/applications.  And each class has a many-to-many relationship with the uses-by-the-users.  It's hard to rationalize/document/control the many-to-many relationship of two many-to-many relationships.  So rather than controlling the individual rows to the individual uses/users, why not co-operate on the management of the meta-data?  The data about the data, and about the process; about the people and the results.  Beauty.  MetaData Data Management.

As you're pointing out though, there's a lot of xls-work that's carried out without involving the users' Business Analysis group (let alone the IT department's Systems Analysis group).  So targeting them for the MDDM may be the necessary first step.  Hopefully, there would be an expert group on each side - subject matter experts in data delivery and in data consumption.


 
