Blog: Claudia Imhoff

Claudia Imhoff

Welcome to my blog.

This is another means for me to communicate, educate and participate within the Business Intelligence industry. It is a perfect forum for airing opinions, thoughts, vendor and client updates, problems and questions. To maximize the blog's value, it must be a participative venue. This means I will look forward to hearing from you often, since your input is vital to the blog's success. All I ask is that you treat me, the blog, and everyone who uses it with respect.

So...check it out every week to see what is new and exciting in our ever-changing BI world.

About the author >

A thought leader, visionary, and practitioner, Claudia Imhoff, Ph.D., is an internationally recognized expert on analytics, business intelligence, and the architectures to support these initiatives. Dr. Imhoff has co-authored five books on these subjects and has written more than 150 articles for technical and business magazines.

She is also the Founder of the Boulder BI Brain Trust, a consortium of independent analysts and consultants. You can follow them on Twitter at #BBBT.

Editor's Note:
More articles and resources are available in Claudia's BeyeNETWORK Expert Channel. Be sure to visit today!

Where is this coming from? Gartner recently “predicted” that 50% of data warehouse projects will have limited acceptance or be outright failures. Why? And on what facts is this dire prediction based?

Apparently Gartner has turned its skeptical eye onto BI -- again. In 2003, Gartner predicted that over 50% of data warehouse projects would fail. That time, the claim was that enterprises would fail to use BI properly, losing market share to those that implement and leverage BI correctly. I guess they have decided that the same story can run again. No real evidence was offered of why, or even whether, that prediction came close.

Now, two years later, Gartner is making the same claim. As for why, Ted Friedman, a principal analyst with the company, claims it will be due to poor data quality. He suggests that the enterprise should create a data quality firewall – something to “sniff out data quality issues that are coming from your suppliers.” My, but that is a rather negative statement that is shy on facts – at least very few were given -- again.
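For illustration, a "data quality firewall" can amount to nothing more exotic than a set of validation rules applied to an incoming feed before it ever touches the warehouse, with failing records quarantined for review rather than loaded. The sketch below is a hypothetical minimal version; the rule names and record fields are invented for illustration, not anything Gartner or Friedman prescribes.

```python
# Minimal sketch of a "data quality firewall": screen incoming records
# against validation rules; quarantine failures instead of loading them.
# All rules and fields here are hypothetical illustrations.

from datetime import date

RULES = {
    "customer_id present": lambda r: bool(r.get("customer_id")),
    "amount non-negative": lambda r: r.get("amount", -1) >= 0,
    "date not in future":  lambda r: r.get("order_date", date.max) <= date.today(),
}

def firewall(records):
    """Split records into (clean, quarantined) based on RULES."""
    clean, quarantined = [], []
    for rec in records:
        failures = [name for name, check in RULES.items() if not check(rec)]
        if failures:
            quarantined.append((rec, failures))  # hold for review, don't load
        else:
            clean.append(rec)
    return clean, quarantined

# A toy supplier feed: one good record, one with two quality problems.
feed = [
    {"customer_id": "C1", "amount": 100.0, "order_date": date(2005, 3, 1)},
    {"customer_id": "",   "amount": -5.0,  "order_date": date(2005, 3, 2)},
]
clean, quarantined = firewall(feed)
```

The value of the "firewall" framing is that bad data is caught and attributed at the boundary, before it can quietly erode user trust in the warehouse itself.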

CIO Magazine just published a story on the remarkable success that the food industry – especially fast food – has had with BI analyses. Of course, they too start their article with a statement that BI has not had the best success rate. Again, no evidence or examples were given.

Sure makes you wonder where these people get their information. Call me a Pollyanna or cockeyed optimist, but the clients I work with have had remarkable success with their projects. I have worked with very few companies that have had less than successful implementations. I welcome your input as to the success of your BI implementations and the reasons for it. Let’s see if we can get some real information on the SUCCESS rates of BI to counter this negative reporting.

Yours in BI success!


Posted March 21, 2005 6:02 PM
8 Comments


Claudia - I recently attended the Gartner BI Seminar in Chicago (March 2005). I am happy to report that the focus of the conference was very positive, and there were many case studies of various companies' BI success stories. I may have seen a presentation slide or two which referenced the "50% failure rate". My opinion is that those presenters need to update their old slides! None of the participants I met during the conference expressed that pessimistic assessment of their BI projects, and that certainly isn't representative of our BI initiatives.
Bill Ennis, Data Warehouse Manager,
Ohio Dept. of Job & Family Services
Columbus, Ohio.

Hi Claudia,
well, when I was selling in Australia I did my very best only to sell to clients who would admit they had failed at least twice in an effort to deliver on BI. And it was a target rich environment!!!

There were plenty of big companies wasting plenty of money on fruitless BI efforts with government departments leading the way. Claims of failed projects are no surprise to me. I've seen plenty, and sad to say, not even all projects I have worked on have been successful.

More importantly, I too have made many efforts to get people, business people, talking about 'what works for the business with BI' rather than techo feeds and speeds. I even started one myself, but to no real result to date.

Projects I have worked on have produced millions of dollars of new profit for my clients. I have always believed more 'success stories' and more 'how to increase profitability using BI' stories would be of benefit to the business community. Alas, it seems, few people are interested. Forums and lists fill up with 'feeds and speeds' long before I see any comments on 'business benefits realisation'.

Best Regards

Peter Nolan

I don't know if I agree with the 50% claim.
However, I have seen a lot of failures claimed as "successes" upon implementation. But a revisit with the business sponsors a few months later reveals the utter failure.
A majority of these failures are due to data quality issues. Currently, I am consulting at a Fortune 50 company which has followed the Kimball approach and created a "Sales Data Mart", an "Inventory Data Mart", and a "Purchasing Data Mart", and now the enterprise wants a consolidated view. In theory, it should be as easy as plugging the marts into the "bus architecture", but the fact is that the business sponsors do not trust the data in these marts.
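To illustrate what "plug and play" means in the bus architecture: when the marts truly share conformed dimensions, a cross-mart view is just a drill-across query that rolls up each mart by the shared key and merges the results. The toy sketch below uses invented data, not the client's actual schema, and it only works when the keys really are conformed and the underlying data is trusted.

```python
# Toy illustration of Kimball-style drill-across: the Sales and
# Inventory marts share a conformed product dimension, so a
# consolidated view is a rollup per mart plus a merge on the key.
# Data and field names are invented for illustration.

from collections import defaultdict

sales_mart = [  # fact rows keyed by the conformed product_key
    {"product_key": "P1", "units_sold": 10},
    {"product_key": "P1", "units_sold": 5},
    {"product_key": "P2", "units_sold": 7},
]
inventory_mart = [
    {"product_key": "P1", "on_hand": 40},
    {"product_key": "P2", "on_hand": 12},
]

def rollup(facts, key, measure):
    """Sum a measure per value of the conformed key."""
    totals = defaultdict(int)
    for row in facts:
        totals[row[key]] += row[measure]
    return totals

sold = rollup(sales_mart, "product_key", "units_sold")
stock = rollup(inventory_mart, "product_key", "on_hand")

# Drill-across: merge the two rollups on the shared conformed key.
consolidated = {
    k: {"units_sold": sold.get(k, 0), "on_hand": stock.get(k, 0)}
    for k in set(sold) | set(stock)
}
```

The mechanics are trivial; the hard part, as the comment above notes, is that no join can fix data the business sponsors do not trust in the first place.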

I absolutely agree with Gartner & co. about creating a "Data Quality firewall" in front of the data warehouse.

Pritesh Desai
Data Architect
Columbus, OH

The claim that poor data quality causes data warehouse failures is correct, but at the same time there are other important factors that need to be considered as well. You have to look into the architecture, design, methodologies, standards, best practices, data integration and quality checks, and the overall strategy and roadmap that lead to potential success or failure. Data quality is always blamed for the mayhem and sits at the center of it all because, in the end, it is the data that gets delivered and measured. If the approaches above are done incorrectly, the quality of the data also suffers, leading to failures.

Andy Vaidya
Chief Architect

>> I appreciate your view, but unfortunately I disagree because of the many less-than-successful data warehouses (DWs) I've seen. Please excuse me if I have written too much here, but this is a hot topic with me.

>> I have been heavily or lightly involved on the reporting side of approximately 15 data warehouses over the past 10 years. I have seen excellent business value realized with some and very little with others. One of the DWs had the CEO log in every day, including Sundays, to see how the business was doing through customized dashboards. Others were collecting data without providing the business users a good way to access it. And others were being re-built again and again.

>> Most of them were in their 2nd or 3rd iteration, trying to get it right. I would consider a 3rd iteration finally being implemented right a 67% failure rate. Oh, how much money can be wasted on these efforts! I've even seen 3rd and later iterations that are still poorly designed. I agree with an earlier poster that some DWs are mediocre at best but are touted as huge successes by a few high-level IT execs. You need to talk to the business users to discover the real truth, and sometimes even they have very low expectations due to poor implementations in the past. I actually feel sorry for most business users who have to rely on poor DWs to get the data they need to do their jobs. So many times I hear IT staff talk about dumb users, when I believe the business users are much brighter than most of the IT people working on DWs.

>> The success of a BI initiative invariably boils down to a few key factors: 1) heavy business user involvement, 2) appropriate technical design and implementation, and 3) management of the entire effort. I consider poor data a very poor excuse, as efforts can be re-directed to fixing the data entry points. Many times the data gets blamed when it is the design or the management that caused the poor data. It is politically easier to blame the data than to shed light on the real problems. If any one of these factors is a weak link in the chain, the project will not be a real success.

>> I've worked with some sr. architects from well-known firms, who talk Kimball-talk to users and fellow IT workers, but still don't really understand it as they continually make poor tactical decisions during the design and development stages. It's really up to management to find key people to lead the projects (at least 1 very good architect is a necessity). It is all too easy for less capable technicians to find a way to blame the data.

>> If non-technical managers have less-than-capable tech leads, the project will be a failure because the DW will not be implemented correctly, and the manager will not realize this until it is too late. Only when a very capable and experienced technical lead is in charge does the DW have a chance of REAL SUCCESS. Even then, there are so many obstacles to being truly successful with one of these efforts that the odds are still against him or her. I believe the real rate of success is closer to 25% if you count earlier iterations as failures. If you don't count the earlier iterations, I would guess about 50% in my experience. I believe the bar is set too low for most DWs that are considered successes. Unless you have dashboards for the execs, operational reports for those who need them, a simple and easy interface for business analysts and managers to pull their own data any time of day, and business users' eyes lighting up with big smiles when they hear the words "data warehouse," I would not consider the DW a major success.

>> But that's me. I'm kind of a perfectionist when it comes to implementing these huge technical efforts. Sometimes, co-workers may express to me that good enough is good enough on many of the processes. But when you have a hundred good enoughs strung together, you have what amounts to a paper house or paper chain that can easily break. Management must understand this and put procedures and processes in place to ensure that everyone is working to achieve the same goal and to ensure that everyday tactical decisions are being made appropriately.

>> It's a huge effort to manage, design, and implement ships of this size, and unfortunately they often go off course.
Don Lacopo
Business Intelligence Architect/Manager
DJL Technology

I also feel that the data quality issue is boolean in nature - either you can fix it or it's technically impossible. To me it looks like a problem that can be fixed.

Within my professional career, I have found that a lack of understanding of the business is another reason for failure. Most of the time the vendor doing the implementation fails to understand the key questions that decision makers have. This is one area the Analytics/BI industry as a whole should work on.

Amit Gupta

In the December 2005 Cutter IT Journal, available in PDF, Ken Collier wrote a wonderful article about testing issues in DW environments. If you're looking for a data quality firewall strategy, this article probably provides a good start.

His article was one of five articles on the topic of agile database techniques. The issue might be an eye-opener for some people.

- Scott

Failure is a relative term. When a data warehouse doesn't meet 100% of its stated objectives, management might tend to classify it as a failure even though 80% of the objectives may have been met.
I have had the opportunity to work on data mart-based data warehouses as well as enterprise-wide data warehouses (EDWs). What I have found is that in the data mart approach, in the initial stages when specific functional marts are being built, users are ecstatic with the results for a specific domain. But as the other marts get built and mature, users start looking for cross-functional data and are severely disappointed by the lack of cohesion as well as performance issues.
On the other hand, in the EDW environment, the initial stages are so slow that there is only a trickle of interested users. But as the warehouse grows, there is a proportionally growing interest in the user community, and users tend to be happy as the data warehouse matures.
As far as data quality is concerned, we would all like to have the cleanest possible data, but it comes with a price and a performance overhead.
With a moderately clean data warehouse, a user looking for pinpoint accuracy is bound to be disappointed, but a user looking for a long-term trend might be more than satisfied.
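A small made-up numeric example of that last point: modest record-level errors can throw any single month's figure off by several percent, while a long-term trend fit barely moves, because fitting a slope averages the noise out. All figures below are invented for illustration.

```python
# Illustration: data-quality noise hurts "pinpoint" figures far more
# than it hurts a long-term trend. Figures are invented.

true_monthly = [100 + 10 * i for i in range(12)]            # true sales, 100..210
errors = [10 if i % 2 == 0 else -10 for i in range(12)]     # alternating +/-10 noise
observed = [t + e for t, e in zip(true_monthly, errors)]    # what the warehouse holds

def slope(series):
    """Least-squares slope of series against month index 0..n-1."""
    n = len(series)
    xm = (n - 1) / 2
    ym = sum(series) / n
    num = sum((i - xm) * (y - ym) for i, y in enumerate(series))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

pinpoint_error = max(abs(e) / t for t, e in zip(true_monthly, errors))
# Individual months are off by up to 10%, yet the fitted trend stays
# within a few percent of the true 10-units-per-month growth.
true_trend, observed_trend = slope(true_monthly), slope(observed)
```

This is why a "moderately clean" warehouse can still satisfy trend-oriented users while disappointing anyone who needs exact figures.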

So it depends on what stage or type of users Gartner surveys to get its 50% failure rate.

Rajesh Arumughan,
Wipro Technologies.
