Master Data Management Checklist #3: Maintaining the Enterprise Scope

Originally published July 30, 2009

In the purest sense of the phrase, as well as in the mindshare of most individuals who use it, “master data management” implies a single repository with a “golden copy” of each data subject area (e.g., customer, product or supplier). That seems pretty comprehensive, incorporating all of the existing copies of the data into a single shared data asset. However, despite the breadth of master data management (MDM) – or any similar enterprise program – a scan of the popular technical literature shows many experts suggesting that you start small, either concentrating on one data subject area or consolidating one master data object type from a subset of the organization’s business applications.

The small start can be contrasted with a big one in which the analysts select a single master data object type, scan the entire enterprise to identify all applications that create or modify instances of that data type, and then attempt to populate the selected master data architecture. In the big approach, though, there is little demonstrable progress until the end of the process. Starting small has benefits, especially when it comes to limiting the need for stakeholder buy-in or reducing the need for enterprise-wide agreement, standards and collaboration, but there are drawbacks as well.

Although consolidating two data sets is fundamentally simpler than consolidating many data sets, you must also realize that each new data source will require incremental adjustments to the master repository: modifications to the models and additional actions to be planned and performed. With the small start, your requirements focus on the limited number of data sets, and you widen the horizon as new data sources are introduced. However, each newly consolidated data source increases the likelihood of losing critical differentiating information associated with other data sources.
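
As a loose illustration of that incremental burden, consider the hedged sketch below. It is Python, and the source layouts, field names and master schema are entirely hypothetical; the point is only that each additional source requires new mapping work and a decision about whether its differentiating attributes survive consolidation.

```python
# Hypothetical sketch: consolidating customer records into a simple master model.
# Source layouts, field names, and the master schema are invented for illustration.

# Version 1 of the master model, shaped around the first two sources
MASTER_FIELDS_V1 = ["customer_id", "name", "postal_code"]

def from_crm(rec):
    # Mapping for a hypothetical CRM extract
    return {"customer_id": rec["cust_no"], "name": rec["full_name"],
            "postal_code": rec["zip"]}

def from_billing(rec):
    # Mapping for a hypothetical billing extract
    return {"customer_id": rec["acct_id"], "name": rec["acct_name"],
            "postal_code": rec["postal"]}

crm_records = [{"cust_no": "C001", "full_name": "Acme Corp", "zip": "10001"}]
billing_records = [{"acct_id": "B-77", "acct_name": "Acme Corporation", "postal": "10001"}]

master = [from_crm(r) for r in crm_records] + [from_billing(r) for r in billing_records]

# A third source carries a differentiating attribute ("tier"). Either the master
# model grows to hold it (a model change plus newly planned actions), or the
# attribute is silently lost during consolidation.
MASTER_FIELDS_V2 = MASTER_FIELDS_V1 + ["credit_tier"]

def from_orders(rec):
    # Mapping for a hypothetical order-entry extract
    return {"customer_id": rec["buyer_id"], "name": rec["buyer_name"],
            "postal_code": rec.get("zip"), "credit_tier": rec.get("tier")}
```
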

Recall that for each checklist, we provide some description and a number of questions, and an assessment would score the organization as follows (a small illustrative scoring sketch appears after the list):

  • If key stakeholders are aware of the item, questions like the examples provided have been asked and answered, and a satisfactory result has been reviewed and approved by all the stakeholders, score GREEN.

  • If no one is aware of the item despite the fact that you are already installing software, or there is limited awareness but few or none of the questions have been asked or answered, score RED.

  • Limited awareness among varying degrees of stakeholders, with only some questions answered, is scored with varying shades of AMBER.
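
Purely as a rough illustration of that rubric, here is a hedged sketch in Python. The inputs (a stakeholder-awareness fraction, the share of questions answered and an approval flag) are assumptions of this sketch, not part of the checklist itself; a real assessment is a facilitated stakeholder review, not a calculation.

```python
# Hypothetical sketch of the GREEN/AMBER/RED scoring described above.
# The thresholds and inputs are illustrative assumptions only.

def score_checklist_item(awareness, questions_answered, approved_by_all):
    """awareness: fraction of key stakeholders aware of the item (0.0-1.0)
    questions_answered: fraction of example questions asked and answered (0.0-1.0)
    approved_by_all: True if results were reviewed and approved by all stakeholders
    """
    if awareness >= 1.0 and questions_answered >= 1.0 and approved_by_all:
        return "GREEN"
    if awareness == 0.0 or questions_answered < 0.25:
        return "RED"
    # Partial awareness with some questions answered: a shade of amber
    return "AMBER ({:.0%} of questions answered)".format(questions_answered)

print(score_checklist_item(1.0, 1.0, True))   # GREEN
print(score_checklist_item(0.6, 0.5, False))  # AMBER (50% of questions answered)
print(score_checklist_item(0.0, 0.0, False))  # RED
```
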

In essence, the decision boils down to scoping. Master data management is strategic, and if progress is measured by quantifying organizational change and maturity, starting small can seem to take away from reaching the long-term strategic goals. But demonstrating incremental value provides collateral for additional business buy-in, and that is always good. Both aspects are necessary, and this checklist is intended to help define the scope of the master data management program as a prelude to developing a road map and planning the project phases. Here are some examples of questions that should be asked and answered:
  1. How many data subject areas are being considered for MDM?

  2. Is there a clear definition of what each data subject area entails?

  3. Are the project sponsors aware of the complexity of the data analysis, preparation and loading processes?

  4. Have you defined clear expectations for making progress in the data analysis, preparation and loading processes?

  5. For each data subject area:

    1. How many models exist for representing the subject area?

    2. How similar are the various models? What are the critical distinctions between the models?

    3. How many business processes create instances?

    4. How many systems create instances?

    5. How many business processes access or modify instances?

    6. How many systems access or modify instances?

  6. Does your organization have a data requirements analysis process that includes interviews with all potential users of the data subject area?

  7. Does your organization have a data requirements definition process that accommodates the potential future use of created data as master data?

  8. Does the organization maintain standards for sharing data across business applications?

  9. Is there a process for identifying master data attributes?

  10. What are the identifying attributes for each data subject area?

  11. Is there a metadata discovery process for candidate data sources?

  12. Is there a data qualification process for candidate data sources?

  13. Is there a plan for the initial loading of the master data asset?

  14. Is there a plan for incremental inclusion of new data sets into the master data asset?

  15. What are the quality criteria for certifying the master data asset?

  16. How often is the master data asset certified?

  17. Will existing business applications migrate to the master data asset?

  18. Will existing systems transition to the master data asset or use a replica of the master data?

  19. How often are replicas synchronized?

  20. What infrastructure is necessary for synchronizing replicas? (A minimal synchronization sketch follows this list.)

  21. Will the deployment consider multiple data subject areas at the same time, or migrate one data subject area at a time?
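
The replication and synchronization questions above lend themselves to a brief sketch. The following Python is a minimal, hedged illustration only; the record layout, key and timestamp convention are assumptions, and real MDM infrastructure would rely on whatever replication services the chosen architecture provides.

```python
# Hypothetical sketch: refreshing a read replica from a master data asset by
# comparing last-modified timestamps. Record layouts and keys are invented.

def synchronize(master, replica):
    """Copy any master record that is missing from, or newer than, the replica."""
    for key, record in master.items():
        current = replica.get(key)
        if current is None or record["updated_at"] > current["updated_at"]:
            replica[key] = dict(record)  # push the fresher master version
    return replica

master = {
    "C001": {"name": "Acme Corp", "updated_at": "2009-07-30T08:30:00"},
    "C002": {"name": "Bolt Ltd", "updated_at": "2009-07-29T10:00:00"},
}
replica = {
    "C001": {"name": "Acme Corp", "updated_at": "2009-07-28T09:00:00"},
}

synchronize(master, replica)
print(sorted(replica))  # ['C001', 'C002'] once the replica has been refreshed
```
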

The challenge here is that the more you consider the ultimate objective of the fully unified source for all data subject areas, the more the scope increases. It is almost like a reverse Zeno’s paradox: with each step you take, the finish line gets further away. Of course, that will not end up in your favor when attempting to get the project funded.

When scoping out the plan for MDM, perhaps it is wise to seek the middle ground – in other words, plan big and strategically; identify key measurable tactical, operational, and strategic benefits; plan tactical phases contributing to the strategic end-game; and execute against short-term expectations. For example, instead of merging two data sets with data integration tools to create a third merged data set, consider planning a master data model that can accommodate the union of all instances of that master object type across the organization, and then follow a sequenced migration plan that incrementally phases a smaller number of data sets into that master model over time.
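
One way to picture that middle ground is sketched below. This is a hedged illustration only: the subject area, attribute names, source systems and phase boundaries are all hypothetical, and the point is simply that the model is planned for the enterprise-wide union up front while the sources are onboarded in deliberate increments.

```python
# Hypothetical sketch of "plan big, execute incrementally": a master customer
# model sized for the union of attributes, populated one source at a time
# according to a phased road map. All names and phases are illustrative.

MASTER_CUSTOMER_MODEL = ["customer_id", "legal_name", "postal_code", "segment", "credit_tier"]

# Phased road map: each phase onboards a small number of sources.
MIGRATION_PHASES = [
    {"phase": 1, "sources": ["crm"]},
    {"phase": 2, "sources": ["billing"]},
    {"phase": 3, "sources": ["order_entry", "support_desk"]},
]

# Per-source mappings onto the master model; attributes a source does not
# carry stay empty until a later phase supplies them.
SOURCE_MAPPINGS = {
    "crm": {"customer_id": "cust_no", "legal_name": "full_name", "postal_code": "zip"},
    "billing": {"customer_id": "acct_id", "legal_name": "acct_name",
                "postal_code": "postal", "credit_tier": "tier"},
}

def conform(source_name, record):
    """Project a source record onto the full master model."""
    field_map = SOURCE_MAPPINGS.get(source_name, {})
    return {field: record.get(field_map.get(field, "")) for field in MASTER_CUSTOMER_MODEL}

# Phase 1: only the CRM source is loaded, but the model already has room for
# the attributes that later phases will contribute.
print(conform("crm", {"cust_no": "C001", "full_name": "Acme Corp", "zip": "10001"}))
```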


