
Blog: David Loshin

David Loshin

Welcome to my BeyeNETWORK Blog. This is going to be the place for us to exchange thoughts, ideas and opinions on all aspects of the information quality and data integration world. I intend this to be a forum for discussing changes in the industry, as well as how external forces influence the way we treat our information asset. The value of the blog will be greatly enhanced by your participation! I intend to introduce controversial topics here, and I fully expect that reader input will "spice it up." Here we will share ideas, vendor and client updates, problems, questions and, most importantly, your reactions. So keep coming back each week to see what is new on our Blog!

About the author >

David is the President of Knowledge Integrity, Inc., a consulting and development company focusing on customized information management solutions, including information quality consulting, information quality training and business rules solutions. Loshin is the author of The Practitioner's Guide to Data Quality Improvement, Master Data Management, Enterprise Knowledge Management: The Data Quality Approach and Business Intelligence: The Savvy Manager's Guide. He is a frequent speaker on maximizing the value of information. David can be reached at loshin@knowledge-integrity.com or at (301) 754-6350.

Editor's Note: More articles and resources are available in David's BeyeNETWORK Expert Channel. Be sure to visit today!

January 2006 Archives

As might have been predicted, last week Informatica announced that it was acquiring UK-based Similarity Systems, a provider of data quality tools. Similarity, itself recently active as an acquirer, must have long looked like a suitable candidate to Informatica, which watched IBM swallow up DQ and ETL tools vendor Ascential (hear more about that one in my interview with Scott McNabb!) early in 2005. As I discussed back when Firstlogic announced their (later aborted) sale to Pitney-Bowes, it has become fashionable for information movement companies to embed a data quality solution.

The announcement is good news for both Similarity and Informatica. Similarity Systems is probably the current thought leader among European data quality tools vendors, having adopted a more process-oriented, full-cycle approach to information quality improvement. Informatica, long unable to provide a full data quality solution on its own, now acquires some of the pieces missing from its repertoire, along with additional focused expertise and the potential for some degree of market expansion.

The next question: how will the acquisition affect Informatica's relationships with Firstlogic and Trillium?

Posted January 30, 2006 11:13 AM

More controversy swirls around the new Medicare regulations. According to this Grand Forks Herald article, bad data associated with Medicare customer systems has cost the state of Minnesota over $2 million. According to the story, "In many cases, the massive and nationwide array of computer problems, bad data (emphasis mine) and overwhelmed Medicare and drug-plan phone lines prevented pharmacists from verifying that a customer was eligible for the deep subsidy - and sometimes unable to find out if that customer even was enrolled in a drug plan."

Two problems are reported: bad data, and the inability to find out whether a customer was enrolled.

The first problem is left unspecified, as if it were clear what makes the data bad. The second indicates a less-than-reliable master customer repository, which suggests that quality expectations were not well specified before the law changed.

What is interesting, though, is that this is a good example of cost impacts that are both quantifiable and attributable to poor data quality (whether it be the nebulous "bad data" or the more precise master data management failure). The simple costs are the gaps in coverage, such as the $2.2M that MN is paying the pharmacists for the 38,000 claims (which works out to a little less than $60.00 per claim). The more important, yet more difficult to quantify cost involves the loss of confidence in the ability to provide low-cost medication to the people who need it most.
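The back-of-the-envelope arithmetic behind that per-claim figure is easy to check (the totals are the ones cited in the article):

```python
# Rough check of the per-claim cost cited above.
total_cost = 2_200_000   # $2.2M paid by Minnesota to pharmacists
claims = 38_000          # number of claims covered

cost_per_claim = total_cost / claims
print(f"${cost_per_claim:.2f} per claim")  # a little less than $60
```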

Moreover, the confusion doesn't really end there. In this article from the Seattle Times, there is talk of failure in communication and information exchange regarding covered medications, enrollment, administration, and workflow bottlenecks.

So here is my last comment, which is intended to reflect on those who constantly ask me for examples of ROI models for data quality. Had everything gone smoothly, with no data issues, there would not have been many incurred costs other than those to ensure high-quality data, which paradoxically implies that there is no measurable return on that investment. Perhaps we should stop trying to use ROI models and consider the fact that good planning and vigilance might provide ample, yet unremarkable, rewards?

Posted January 23, 2006 8:02 AM

According to an analysis done by US Pharmacopeia and reported in the Washington Post, "Medication errors that harm patients are seven times more frequent in the course of radiological services than in other hospital settings."

According to US Pharmacopeia's John Santell, "Many of the errors resulted from communication breakdowns, the researchers found, such as passing on incorrectly the dose or name of the drug being administered, or one worker failing to inform another about other drugs a patient was taking. The most common errors were patients getting the wrong dose or drug, failing to get the drug they should have had or having the drug administered incorrectly."

The existence of communication breakdowns in the operational (no pun intended) processes within a health environment raises the question of whether "electronifying" or automating the exchange of patient information might allow validation rules (or workflow requirements for accountability signoffs) to be introduced into the process, identifying potential drug administration errors before they occur. In addition, logging all actions associated with moving a patient through a particular medical process within an automated system might also help in accurately capturing "what really happened," aiding the remediation of critical errors if they do slip through.
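To make the validation-rule idea concrete, here is a minimal sketch of the kind of check I have in mind. The drug names, dose limits, and record fields are entirely hypothetical; a real system would draw these from a maintained formulary and order-entry workflow.

```python
# Hypothetical formulary: maximum single dose in milligrams per drug.
FORMULARY_MAX_DOSE_MG = {"heparin": 10_000, "metformin": 1_000}

def validate_order(order: dict) -> list:
    """Return a list of issues that should block administration."""
    issues = []
    drug = order.get("drug", "").lower()
    if drug not in FORMULARY_MAX_DOSE_MG:
        issues.append(f"unknown drug: {drug!r}")
    elif order.get("dose_mg", 0) > FORMULARY_MAX_DOSE_MG[drug]:
        issues.append(f"dose exceeds formulary maximum for {drug}")
    if not order.get("signoff_by"):
        issues.append("missing accountability signoff")
    return issues

# An order with a suspicious dose and no signoff is flagged twice.
print(validate_order({"drug": "heparin", "dose_mg": 25_000}))
```

The point is not the specific rules but the pattern: every order passes through an explicit, auditable gate before anything is administered, and the gate's decisions are logged.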

Anyone familiar with health care workflow automation that could help in this situation?

Posted January 19, 2006 6:22 AM

I have started to take a good look at Google Earth. I have long been a believer in the potential of using geographical information systems as part of a business intelligence program. One task I am setting for myself is to see whether this new (and extremely mesmerizing) tool from Google can be easily integrated as a visualization layer for analysis and reporting. Apparently you can add your own layers into the views, which might provide some pretty cool conceptual representations regarding spatial analysis.
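Those custom layers are defined in KML, Google Earth's XML format. As a sketch of what an analysis-to-visualization bridge might look like, the following generates a tiny KML layer from tabular results; the place names, coordinates, and metric values are made up for illustration.

```python
# Build a minimal KML layer from hypothetical analysis output.
points = [
    ("Store A", -77.04, 38.91, 1250),  # (name, longitude, latitude, metric)
    ("Store B", -76.61, 39.29, 980),
]

placemarks = "\n".join(
    f"""  <Placemark>
    <name>{name}: {value}</name>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>"""
    for name, lon, lat, value in points
)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
 <Document>
{placemarks}
 </Document>
</kml>"""

# Saved as a .kml file, this can be opened directly in Google Earth.
with open("sales_layer.kml", "w") as f:
    f.write(kml)
```

Each placemark would appear as a labeled point, which is about the simplest spatial overlay one could layer on top of a BI report.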

Let me know if you have tinkered with this at all...

Posted January 12, 2006 7:51 PM

A few weeks ago I posted an entry about Google offering its analytics package for free, and how the overwhelming response flooded its ability to provision new service accounts. Well, six weeks later, they are still laboring to meet the demand. Not only that, the service seems to be plagued by intermittent issues (response time, functionality, etc.), most pointedly in the area of customer support.

Hey, all you analyst experts: there may be a business opportunity in developing a third-party service to help get Google Analytics users up, running, satisfied, and enlightened...

Posted January 6, 2006 7:19 AM

