

Blog: David Loshin

David Loshin

Welcome to my BeyeNETWORK Blog. This is going to be the place for us to exchange thoughts, ideas and opinions on all aspects of the information quality and data integration world. I intend this to be a forum for discussing changes in the industry, as well as how external forces influence the way we treat our information asset. The value of the blog will be greatly enhanced by your participation! I intend to introduce controversial topics here, and I fully expect that reader input will "spice it up." Here we will share ideas, vendor and client updates, problems, questions and, most importantly, your reactions. So keep coming back each week to see what is new on our Blog!

About the author >

David is the President of Knowledge Integrity, Inc., a consulting and development company focusing on customized information management solutions, including information quality consulting, information quality training and business rules solutions. Loshin is the author of The Practitioner's Guide to Data Quality Improvement, Master Data Management, Enterprise Knowledge Management: The Data Quality Approach and Business Intelligence: The Savvy Manager's Guide. He is a frequent speaker on maximizing the value of information. David can be reached at loshin@knowledge-integrity.com or at (301) 754-6350.

Editor's Note: More articles and resources are available in David's BeyeNETWORK Expert Channel. Be sure to visit today!

Recently in Governance Category

I came across a few articles last week about easing the cost of Sarbanes-Oxley compliance for small and medium-sized companies. This article from the New York Times comments that "the House Financial Services Committee moved to permanently exempt companies worth less than $75 million from the auditing provisions of the Sarbanes-Oxley Act."

Despite being touted as a measure to ease the financial burden, I have reservations about this for two reasons. First, eliminating the auditing provision essentially removes any capability to assure investors that processes are in place to verify that the financial data meets specified compliance criteria. In turn, this opens the door to noncompliance and places the burden on shareholders to force the company to be honest about its finances.

Second, it eliminates the need to institute a key best practice for data quality management - transparent inspection and monitoring of enterprise data. As a data quality practitioner, I am disappointed that the government is stepping away from mandated data quality management and data governance.
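To make the practice concrete, here is a minimal sketch of what transparent inspection and monitoring of enterprise data might look like in code. Everything here (the rules, record fields, and sample values) is an illustrative assumption of mine, not anything prescribed by Sarbanes-Oxley.

```python
# A minimal sketch of rule-based inspection and monitoring of enterprise data.
# The rules and record fields below are illustrative assumptions only.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class InspectionRule:
    name: str
    conforms: Callable[[dict], bool]  # returns True when a record passes the check

def monitor(records: List[dict], rules: List[InspectionRule]) -> Dict[str, float]:
    """Report the conformance rate for each rule across all records."""
    rates = {}
    for rule in rules:
        passed = sum(1 for record in records if rule.conforms(record))
        rates[rule.name] = passed / len(records) if records else 1.0
    return rates

# Hypothetical checks over hypothetical transaction records.
rules = [
    InspectionRule("amount_is_nonnegative", lambda r: r.get("amount", 0) >= 0),
    InspectionRule("has_approver", lambda r: bool(r.get("approver"))),
]
records = [
    {"amount": 1200.00, "approver": "jsmith"},
    {"amount": -50.00, "approver": ""},
]
print(monitor(records, rules))  # {'amount_is_nonnegative': 0.5, 'has_approver': 0.5}
```

The point of the sketch is simply that the monitoring results are produced by a repeatable, visible process rather than by assertion.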


Posted November 13, 2009 6:48 AM
Permalink | No Comments |

So far I have seen a number of environments that pay lip service to metadata as the be-all and end-all for solving every enterprise data issue and satisfying every enterprise data management need. The reality seems to be that metadata delivers a lot of value in a number of situations, although the value proposition for investing in a full-scale implementation still seems somewhat lacking.

Some basic implementations cover data entity definitions and structures, along with the corresponding data element definitions and structures. Yet often the metadata repository is largely uni-directional, acting as a sink for data definitions and the like, but having no "active" componentry that feeds back to the consuming applications.
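As a rough illustration, here is a sketch of what such a passive repository might capture, assuming a simple Python model; the entity, element, and field names are hypothetical.

```python
# A minimal sketch of a "passive" repository of entity and data element
# definitions. All names and fields are illustrative; note that nothing
# here feeds definitions back out to the consuming applications.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class DataElement:
    name: str
    data_type: str
    definition: str

@dataclass
class DataEntity:
    name: str
    definition: str
    elements: List[DataElement] = field(default_factory=list)

class MetadataRepository:
    """Acts purely as a sink: definitions are registered and can be looked up."""
    def __init__(self) -> None:
        self.entities: Dict[str, DataEntity] = {}

    def register(self, entity: DataEntity) -> None:
        self.entities[entity.name] = entity

    def lookup(self, name: str) -> Optional[DataEntity]:
        return self.entities.get(name)

repo = MetadataRepository()
repo.register(DataEntity(
    name="Customer",
    definition="A party that purchases goods or services",
    elements=[DataElement("customer_id", "string", "Unique customer identifier")],
))
```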

The upshot is a need for continuous investment in maintenance. However, the situations that show the criticality of metadata are those where the systems are changing - modernizations, migrations to ERP, MDM implementations. In essence, these are the places where the current system is being scrapped and the data needs to move to a new one.

This is a true conundrum - there is a need to maintain the metadata (and a corresponding investment) while the systems are in use in preparation for their retirement. While the systems are in production, the metadata is not in great demand (since things are typically not going to change too much). This lowers the perceived priority of metadata management.

You do need metadata when you are changing things, yet at that point you are throwing out not just the existing system but also its reliance on the existing documented metadata. The return therefore seems limited, because you have invested a huge effort in maintaining documentation for something you are about to retire. And yet I do need that metadata when I migrate the data, so that I know what I have to work with.

And yet, metadata management is an indicator of good data management practices, and is likely to coincide with good system development and maintenance practices, lowering the need for system modernization.

So metadata is usually needed when I don't have it and not needed when I do.

On top of that, the effort to maintain discrete information about the thousands (if not tens of thousands) of data elements used across an organization is gargantuan, which also limits the utility of a metadata resource (since it will take forever to collect all the information).

The answer has got to be somewhere in between - "just enough metadata" to support existing application needs (for improvements and upgrades to functionality) and enough to support the processes needed to retire the applications and design their replacements.

Anyone have any experiences that can support this view? Post them!


Posted September 15, 2009 7:53 AM
Permalink | 1 Comment |

As a by-product of some of our current activities in data governance, I was interested in looking at ways that people model performance metrics. Interestingly, half an hour's worth of web searching turned up surprisingly few artifacts that describe ways to model a performance metric. Perhaps my search term vocabulary is artificially limited to the phrases I believe should provide some hits, since I am confident that every BI tool vendor has embedded models for performance metrics.

However, the failed search exercise has triggered the dreaded next step: having to think about it myself. My first thoughts revolve around "metric basics" (a rough sketch of how they might fit together follows the list):

- who are the stakeholders,
- what are the performance objectives,
- what is being measured,
- what are the units of measure,
- how is the measurement performed,
- how often is the measurement done,
- is the measurement process automated or manual,
- how is the result reported,
- how are individual measurements rolled up into more comprehensive scores,
- what are the benchmark values,
- what are the critical thresholds,
- who is notified of a low score,
- how are issues forwarded into the issues tracking system.
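
As a starting point, here is a rough sketch of how those basics might be captured in a single metric model. It assumes Python, and every field name and type is my own illustrative choice rather than anything standard.

```python
# A rough sketch of a performance metric model built from the "metric basics"
# listed above. Field names and types are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class CollectionMode(Enum):
    AUTOMATED = "automated"
    MANUAL = "manual"

@dataclass
class PerformanceMetric:
    name: str
    stakeholders: List[str]
    performance_objective: str
    measured_object: str                 # what is being measured
    unit_of_measure: str
    measurement_procedure: str           # how the measurement is performed
    frequency: str                       # how often the measurement is done
    collection_mode: CollectionMode      # automated or manual
    reporting_channel: str               # how the result is reported
    rollup_weight: float = 1.0           # weight when rolled into composite scores
    benchmark_value: Optional[float] = None
    critical_threshold: Optional[float] = None
    notify_on_low_score: List[str] = field(default_factory=list)
    issue_tracking_queue: Optional[str] = None  # where issues are forwarded
```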

Any other suggestions?


Posted July 31, 2007 7:23 AM
Permalink | 1 Comment |

My company has been involved in a lot of data governance work recently. Two of the main drivers are regulatory compliance and consistency in reporting (which often rolls back to compliance). Interestingly, in some of the client industries, fraud detection seems to be an additional driver. This is a little curious to me. On the one hand, fraud detection fits into the compliance framework - looking for non-conformance to business policies. In both cases, we essentially identify critical policies, define rules that indicate conformance to those policies, and generate alerts when those policies are violated.
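To illustrate the shared mechanics, here is a minimal sketch of that pattern: policies expressed as rules, records checked for conformance, and alerts generated on violations. The policy names and record fields are hypothetical.

```python
# A minimal sketch of policy-rule checking with alerts on violations.
# Policy names and record fields are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Policy:
    name: str
    conforms: Callable[[dict], bool]  # returns True when the record conforms

def generate_alerts(records: List[dict], policies: List[Policy]) -> List[str]:
    """Return one alert message for every policy a record violates."""
    alerts = []
    for record in records:
        for policy in policies:
            if not policy.conforms(record):
                alerts.append(f"Violation of '{policy.name}': {record}")
    return alerts

# The same machinery serves both drivers: a compliance-style rule about our
# own conduct, and a fraud-style rule about others transgressing our limits.
policies = [
    Policy("payments_require_approval", lambda r: r.get("approved", False)),
    Policy("claim_amount_within_limit", lambda r: r.get("claim_amount", 0) <= 10000),
]
for alert in generate_alerts([{"approved": True, "claim_amount": 25000}], policies):
    print(alert)
```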

The difference is that compliance is introspective while fraud detection is outward-looking. Compliance seeks to guard your own behavior, examining whether the organization is living up to everyone else's expectations. Fraud detection looks outward, seeking to figure out how your own rules are being transgressed by others.

I can imagine another significant difference - fraud is deliberate, with the perpetrators intentionally trying to avoid detection. Compliance issues can be intentional, but control processes certainly also target inadvertent non-compliance.

This raises a different business challenge: it may be possible that there are corporate business policies that conflict with externally-imposed regulations. If so, does the issue of compliance change from self-policing to weighing the risk of noncompliance against the risk of getting caught? And if the latter is the case, it suggests that internal governance programs are "window-dressing," especially when the real (i.e., intentional) transgressions are going to be well-hidden.


Posted May 13, 2007 5:46 PM
Permalink | No Comments |
