


Business Intelligence Pervasiveness (and is it Wise to Ignore those Shadow Spreadbasemarts?)


 

Originally published March 26, 2009


Overview

David Loshin examines the business intelligence competency center and ubiquitous spreadbasemarts.

David Loshin

Comments


Posted June 11, 2009 by George Allen

Good article. I work in a healthcare environment and have been surprised by the number of "spreadmarts" in use here and by how little of their intrinsic value has rolled up into the organization's BI infrastructure. As I roll out the knowledge management structure for this organization, one of my main tasks will be to document all of these little stores of data, determine which should be rolled up into the platform, and establish some standardization of the tools. Granted, I don't want to disturb the effectiveness of the tools as they are, but rather bring their value into the organization's decision tree and enhance the capabilities of the tools.

The BICC idea as laid out in this article is definitely a dinosaur, but it appears to be the place I find my team in. I am not part of IT; rather, I sit in a function within the director's (CEO) office but am looked upon as the "experts" in BI. I think that is appropriate in a way, so that ownership of the process is not laid completely on IT but on the business side of the house. IT is a part of the process, but should never drive it.

George O. Allen

NWIHCS VA

This comment is my own opinion and not the opinion or policy of the Department of Veterans Affairs.

 


Posted March 30, 2009 by Mark Flaherty

I think spreadmarts are a fascinating topic. I doubt there will be a perfect solution, but it is an interesting area to watch. We (InetSoft) make BI software, and our approach is to try to accommodate business users as much as possible.

Besides making the software easy to use (which everyone should strive for), the novel thing we do is let users import those spreadmarts into the BI platform so they become shareable and more reliable. We also let users mash up internal and external data in the platform, which is a common reason for the xls in the first place. I think that's the direction vendors and internal IT need to go: remove the frustrations with centralized BI and make it easier for non-technical folks to get on board.
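As a rough illustration of that import idea (this is not InetSoft's actual API; the connection string, file name, and staging schema below are assumptions for the example), a spreadmart workbook can be loaded into a shared database so its contents become queryable alongside governed BI content:

```python
# Hypothetical sketch: pull a local "spreadmart" workbook into a shared
# database so the data becomes shareable and queryable by the BI platform.
import pandas as pd
from sqlalchemy import create_engine

# Assumed connection string and file name, for illustration only.
engine = create_engine("postgresql://bi_user:secret@bi-server/warehouse")

# Read every sheet in the workbook; each sheet is a candidate table.
sheets = pd.read_excel("regional_sales_tracker.xlsx", sheet_name=None)

for sheet_name, frame in sheets.items():
    # Normalize column names so downstream reports can rely on them.
    frame.columns = [c.strip().lower().replace(" ", "_") for c in frame.columns]
    # Land each sheet in a staging schema for review before it is promoted
    # into the governed warehouse model.
    frame.to_sql(sheet_name.lower(), engine, schema="spreadmart_staging",
                 if_exists="replace", index=False)
```

Once the data is in a shared store, mashing it up with internal warehouse tables becomes a join rather than another copy-and-paste into a new xls.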

If anyone is interested in reading more on this data mashup topic and why it's good for business, I wrote a whitepaper on it at http://www.inetsoft.com/resources/prDM/evaluate/whitepapers/


Posted March 27, 2009 by David Loshin

Guys, thanks for both of your comments. Joseph, I share your desire to drive a means for managing one official version of the facts, but in many situations it is difficult, if not impossible, to do when there is a difference between the "facts" and what everyone's perception of the "truth" is. Even in the example you gave regarding sales tax, I am sure that there are some organizations that provide the data for an audit without themselves being completely convinced of its veracity. But perhaps close to the facts is good enough in most cases.

 

John, I appreciate your suggestions as well, as long as managing the meta-meta-meta data doesn't get *too* out of hand. It can turn into a huge time sink with little perceived business benefit.


Posted March 26, 2009 by Joseph Subits

A few years ago, before the economy tanked and distracted those in the executive suite with more pressing matters, the notion that a mistake in the routine processing taking place in production spreadmarts could lead to a materially relevant misstatement of the books actually kept a few C-level folks up at night. In fact, one compliance software vendor has a great slide citing three or four examples of significant fines, and in one case even jail time, because of a mistake made in a critical financial spreadsheet. With that much at stake, you start to wonder how most corporations can afford NOT to have one version of the facts. Furthermore, after seeing the recent chaos in the world's financial institutions, maybe it is time for some sort of regulatory edict requiring all corporations to maintain an official version of the facts to some generally acceptable standard for all record keeping.

The closest thing I have seen to this is the typical state sales tax audit. A company must show that the sales amount stated on its tax return for a particular period ties out to the same amount in its consolidated general ledger, local ledgers, and customer invoice history data. Most companies scramble to pull this together at the time of audit. However, I'm willing to bet that at least one company out there can produce the entire balanced picture from a fully integrated and fiscally balanced enterprise data warehouse.


Posted March 26, 2009 by John Mowbray

Loved it. More than 20 years ago, I had the same debates with IT operations and application delivery people about whether to manage distribution of mainframe data or just to continue to require access through the green screens. And the work-arounds: hello screen scrapers and 3270 emulators!

I'll throw out a few other "offhand" thoughts that were triggered by the article.

Certainly the horsepower of computing has increased over the years, but it seems the type of problem hasn't changed: faster machines and newer software, targeting and tailoring more info for more workers at more levels.

I make the assumption that IT isn't the master of what and how user departments consume data, because it can't be: there's too much that changes too fast.

I believe there are a lot of things in the business and technology worlds that combine to make the user-xls approach the most effective (e.g. demonstrable/measurable costs and benefits, as well as local control of the requirements, solutions, delivery timing, and cost).

Since I'm a data/object modeller, I like the idea of rolling things up a level, e.g., generalization/inheritance. It reduces the number of moving parts, and the reduced complexity lets me get my brain around more related bits: other types/classes of data, other sources and sinks, other patterns. When I do that while mapping data element use to business processes, I see that each class of data has a many-to-many relationship with the real row-level data elements in the DW/applications. And each class has a many-to-many relationship with the uses by the users. It's hard to rationalize/document/control the many-to-many relationship of two many-to-many relationships. So rather than controlling the individual rows to the individual uses/users, why not co-operate on the management of the metadata? The data about the data, and about the process; about the people and the results. Beauty. MetaData Data Management.
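To make that a little more concrete, here is a minimal, hypothetical sketch of such a metadata registry (the table and column names are illustrative assumptions, not any standard): generalized classes of data on one side, physical data elements and business uses on the other, with bridge tables carrying the two many-to-many relationships.

```python
# Hypothetical "MetaData Data Management" registry: generalized data classes
# linked many-to-many to physical data elements and to business uses.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE data_class   (class_id INTEGER PRIMARY KEY, name TEXT, definition TEXT);
CREATE TABLE data_element (element_id INTEGER PRIMARY KEY, source_system TEXT,
                           table_name TEXT, column_name TEXT);
CREATE TABLE business_use (use_id INTEGER PRIMARY KEY, owner TEXT, process TEXT, artifact TEXT);

-- Bridge tables carry the two many-to-many relationships.
CREATE TABLE class_element (class_id INTEGER, element_id INTEGER,
                            PRIMARY KEY (class_id, element_id));
CREATE TABLE class_use     (class_id INTEGER, use_id INTEGER,
                            PRIMARY KEY (class_id, use_id));
""")

# Illustrative entries: one generalized class, two physical elements, one spreadsheet use.
conn.execute("INSERT INTO data_class VALUES (1, 'Customer Revenue', 'Recognized revenue by customer')")
conn.execute("INSERT INTO data_element VALUES (1, 'DW',  'fact_sales', 'net_amount')")
conn.execute("INSERT INTO data_element VALUES (2, 'ERP', 'ar_invoice', 'invoice_total')")
conn.execute("INSERT INTO business_use VALUES (1, 'Finance', 'Monthly close', 'rev_summary.xls')")
conn.executemany("INSERT INTO class_element VALUES (?, ?)", [(1, 1), (1, 2)])
conn.execute("INSERT INTO class_use VALUES (1, 1)")

# Which spreadsheets depend, via the class, on which warehouse columns?
rows = conn.execute("""
    SELECT u.artifact, e.table_name, e.column_name
    FROM business_use u
    JOIN class_use     cu ON cu.use_id    = u.use_id
    JOIN class_element ce ON ce.class_id  = cu.class_id
    JOIN data_element  e  ON e.element_id = ce.element_id
""").fetchall()
print(rows)
```

The point is that the registry only has to manage the classes and the two bridge tables, not every individual row-to-user pairing.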

As you're pointing out, though, there's a lot of xls work that's carried out without involving the users' Business Analysis group (let alone the IT department's Systems Analysis group). So targeting them for the MDDM may be the necessary first step. Hopefully, there would be an expert group on each side: subject matter experts in data delivery and in data consumption.
