Master Data Management Checklist #5: Data Quality Mechanics

Originally published October 29, 2009

Data quality has emerged as a strong motivator for master data management. The desire for consistency and accuracy of enterprise data (especially customer and product information) has finally grabbed the attention of the right level of management to enable at least a preliminary commitment to establishing good data management practices. In fact, master data management (MDM) is firmly rooted in data quality techniques – many MDM activities have evolved out of data cleansing processes needed for data warehousing, business intelligence or data migrations as new systems are introduced and brought online.

The ability to use the traditional data quality toolset of data parsing, standardization and matching enables the development of a “customer master,” “product master,” “security master,” etc. that becomes the master entity index to be used for ongoing identity resolution and elimination of duplicate entries. In fact, the realization that the entity index itself represented a valuable information resource was a precursor to the development of the master repository and the corresponding services supporting master data management.
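
To make that concrete, here is a minimal sketch of the parse/standardize/match cycle in Python, using only the standard library. The abbreviation table, similarity threshold and record values are illustrative assumptions rather than a prescribed design.

```python
import re
from difflib import SequenceMatcher

# Illustrative abbreviation table; a production deployment would use a
# much larger, domain-specific standardization dictionary.
ABBREVIATIONS = {"ST": "STREET", "AVE": "AVENUE", "INC": "INCORPORATED"}

def standardize(value):
    """Parse and standardize a raw value: uppercase, strip punctuation,
    collapse whitespace and expand known abbreviations."""
    tokens = re.sub(r"[^\w\s]", " ", value.upper()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def resolve(candidate, master_index, threshold=0.85):
    """Match a standardized candidate against the master entity index.
    Returns the best matching master ID above the threshold, or None
    when the candidate appears to be a new entity."""
    std = standardize(candidate)
    best_id, best_score = None, 0.0
    for master_id, master_name in master_index.items():
        score = SequenceMatcher(None, std, master_name).ratio()
        if score > best_score:
            best_id, best_score = master_id, score
    return best_id if best_score >= threshold else None

master = {1: "ACME INCORPORATED", 2: "GLOBEX CORPORATION"}
print(resolve("Acme, Inc.", master))   # -> 1 (duplicate of master entry 1)
print(resolve("Initech LLC", master))  # -> None (candidate for a new entry)
```

In practice, the matching threshold would be tuned against known duplicates, with borderline scores routed to a data steward for manual review.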

Master data management is both driven by and reliant on high quality data from across (and from outside of) the enterprise. The intent is that data is extracted from many different sources, parsed, cleansed, matched, linked, and boiled down into that unified view; at the conceptual level, these quality transformations are critical to ensuring trustworthiness as the data is consolidated into the master data environment. But from an operational standpoint, there are bound to be slight, and perhaps even significant, variations among the data sources with respect to the core definitions, meanings, formats, structures, representations and presentation of the data elements embedded within the source models prior to consolidation into a single master data object.
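
One common way to reconcile those variations during consolidation is a set of survivorship rules. The sketch below is a simplified illustration – the source rankings and field names are hypothetical – in which each attribute of the golden record survives from the most trusted source that actually supplies a value.

```python
# Hypothetical source precedence for this domain: lower rank = more trusted.
SOURCE_RANK = {"CRM": 0, "BILLING": 1, "LEGACY": 2}

def consolidate(matched_records):
    """Build one golden record from a group of matched source records.
    Each attribute survives from the most trusted source that actually
    supplies a value, collapsing the source-level variations in format
    and representation into a single master view."""
    ordered = sorted(matched_records, key=lambda r: SOURCE_RANK[r["source"]])
    golden = {}
    for record in ordered:
        for field, value in record.items():
            if field != "source" and value and field not in golden:
                golden[field] = value
    return golden

matched = [
    {"source": "LEGACY", "name": "ACME INC", "phone": "555-0100", "email": ""},
    {"source": "CRM", "name": "ACME INCORPORATED", "phone": "",
     "email": "info@acme.example"},
]
print(consolidate(matched))
# {'name': 'ACME INCORPORATED', 'email': 'info@acme.example', 'phone': '555-0100'}
```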

The challenges of ensuring data quality within the MDM environment are those associated with identifying critical data elements, determining which data elements constitute master data, locating and isolating master data objects that exist within the enterprise, and reviewing and resolving the variances between the different representations in order to consolidate instances into a single view. Even after the initial migration of data into a master repository, there will still be a need to institute data inspection, monitoring and controls to identify potential data quality issues and prevent material business impacts.
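
In practice, those inspection and monitoring controls often take the form of executable validation rules applied to new and changed records. The rules below are placeholders for whatever data quality expectations the business actually defines.

```python
import re

# Illustrative inspection rules; each returns True when the record passes.
# Real rules would encode the expectations agreed with the business.
RULES = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "email_well_formed": lambda r: bool(
        re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email", ""))
    ),
    "country_in_reference_set": lambda r: r.get("country") in {"US", "CA", "GB"},
}

def inspect(record):
    """Return the names of all rules the record violates, so that issues
    are caught at the point of entry rather than after a business impact."""
    return [name for name, rule in RULES.items() if not rule(record)]

incoming = {"customer_id": "C-1001", "email": "jane@example.com", "country": "FR"}
print(inspect(incoming))  # -> ['country_in_reference_set']
```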

So this month’s checklist focuses on data quality tools and techniques used to ensure the quality of the consolidation process and maintenance of the master data. Recall that for each checklist, we provide some description and a number of questions, and an assessment would score the organization as follows:

  • If key stakeholders are aware of the item, questions like the examples provided have been asked and answered, and a satisfactory result has been reviewed and approved by all the stakeholders, score GREEN.

  • If no one is aware of the item (despite the fact that you are already installing software), or there is limited awareness but few or none of the questions have been asked or answered, score RED.

  • Limited awareness among varying subsets of the stakeholders, with some of the questions answered, is scored in varying shades of AMBER.

Checklist #5: Data Quality Mechanics

  1. Is there a data quality program in your organization?

  2. Who is the data quality manager?

  3. Is there a process for data quality assessment?

  4. How often is the quality of data sets assessed?

  5. What is the process for reacting to a discovered data error?

  6. How are business needs determined for data quality tools?

  7. Does your organization perform data standardization or cleansing on a regular basis?

  8. If so, how often is that process executed?

  9. Does your organization have a data enhancement process?

  10. If so, how often is that process executed?

  11. Does your organization have a data profiling tool?

  12. How is the data profiling tool used?

  13. Does your organization have a data monitoring tool?

  14. Is automated data validation performed?

  15. Are the results of data validation posted to a data quality scorecard? (See the scorecard sketch following this list.)

  16. What data quality tools are available for use?

  17. How many data quality vendors provide solutions and/or services?

  18. Are there training sessions for the use of data quality tools and techniques?

  19. What staff members are trained in the use of data quality technology?

  20. Is there a data quality incident reporting and tracking tool in use?
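
To make questions 14 and 15 concrete, here is a simple sketch of rolling automated validation results up into a scorecard, reusing the traffic-light scoring described above. The pass-rate thresholds are assumptions to be adapted, not standards.

```python
def scorecard(results):
    """Roll per-rule validation outcomes up into traffic-light grades.
    The thresholds are illustrative: a pass rate of at least 99% scores
    GREEN, at least 95% scores AMBER, and anything lower scores RED."""
    grades = {}
    for rule, outcomes in results.items():
        pass_rate = sum(outcomes) / len(outcomes)
        if pass_rate >= 0.99:
            grades[rule] = "GREEN"
        elif pass_rate >= 0.95:
            grades[rule] = "AMBER"
        else:
            grades[rule] = "RED"
    return grades

# Each list holds one True/False outcome per record checked in the latest run.
nightly_run = {
    "email_well_formed": [True] * 990 + [False] * 10,         # 99.0% pass
    "country_in_reference_set": [True] * 930 + [False] * 70,  # 93.0% pass
}
print(scorecard(nightly_run))
# -> {'email_well_formed': 'GREEN', 'country_in_reference_set': 'RED'}
```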

Holistic data quality assurance, including data assessment, parsing, standardization, identity resolution, enterprise integration, auditing and monitoring – in fact, all of these aspects of master consolidation – relies on data quality tools and techniques successfully employed over time for both operational and analytic purposes. And master data integration cannot be done properly unless the right kinds of tools and technology are available. When it comes to selecting data quality tools, make sure that a proper business needs assessment has been performed and that clear guidelines are specified for evaluating and selecting a tools provider. In order for applications to rely on high quality master data, the MDM team must ensure that the data absorbed into the master repository is of the highest measurable quality, so make sure that you arm yourselves with the most appropriate tools for the job.



Comments


Posted November 3, 2009 by Ken O'Connor kenoconnor00@gmail.com

David,

Excellent article, well worth reading. Thank you also for the checklist.

I like the traffic lights approach to scoring an organisation on its data quality processes.

I take a similar approach when assessing Enterprise Wide Data Governance issues, and I can see myself using your checklist as part of that process.

For details of the process I use, see: 

Process for assessing status of common Enterprise-Wide Data Governance Issues

Rgds Ken  
