
Business Drivers and Master Data

Originally published February 26, 2009

Over the past month, I have done an informal “survey” of promotional material regarding master data management (including various vendor websites, my own book and other books) with the intention of understanding how the value proposition is communicated to potential technology adopters. Common concepts and phrases that appear frequently include:

  • 360-degree customer view

  • Single source of truth

  • Golden copy

  • Reduces complexity

  • Enables service-oriented architectures (SOA)

  • Increased revenue

  • Enables regulatory compliance

  • Improved data quality

  • Improved collaboration

  • Consistency of definition

  • Enables cross-selling and up-selling

  • Operational efficiency

I won’t deny that any of these are good results (they are certainly good sound bites), nor that master data management (MDM) will enable those results. On the other hand, one might consider whether this list is a little too generic when it comes to establishing a reasonable value proposition, especially once you consider the level of effort that may be necessary in design, development and deployment in order to achieve these goals. Before traipsing upstairs with the funding proposal, it might be worthwhile to invest some thinking time to determine where the real business benefits are, and the degree to which MDM is the appropriate solution. In essence, this introduces two trains of thought:

  1. Business process requirements for master data

  2. Level of effort for master data integration

Often, the MDM solution, like all other potential silver bullets, is proposed as a technical solution to a potential technical problem. Multiple copies of data, inconsistency of definition, operational inefficiency, etc. are artifacts of systemic application design and implementation (or, more likely, the lack thereof). The intention of MDM is to “clean up” the unintended replication and redundancy by collapsing lots of stuff into a single repository, a “master” version shared by all the participating applications and corresponding business processes. The premise is that through consolidation the master copy will be of the highest quality, which will best serve everybody.

However, this may be a case of the tail wagging the dog. Is the actual business need for a single version of the data, or just multiple versions, each of which is of higher quality? Drill down into this a little bit and you may need additional information from your business customers. What constitutes a requirement for master data? A situation in which two business processes need to have a fully shared view of the same representation of a data item? Even if we use this working definition, it takes someone with horizontal eyesight to cut across the vertical application infrastructure inherent to most organizations, since every business user will reflect his or her process’s needs – it is up to the MDM analyst to figure out where the overlap is, if there is any at all.

This leads to the second train of thought – the level of effort for master data integration. I am not talking about “integrating” data into the master repository, but rather “integrating” data out of the master repository. In simple environments, the MDM system’s service layer provides a means for application use of the data in the repository, but even this requires a retooling of the application to transition from using its own data silo to using the master data service layer or the construction of extraction and transformation routines to move data in and out of the master repository. This also presumes that the semantics associated with any particular master data object are aligned between the application and the master copy, which suggests a need for a dose of metadata management as well. To what extent is it necessary to adjust existing applications to transition to using the master repository? How much effort is involved?

This brings it back to the question of business requirements and MDM. If we look at each of the proposed benefits listed earlier, it is necessary to ask a more fundamental question. Take that benefit and insert it into this question: To what extent is the business process improved by having (fill in the blank)? If there is a reasonable answer to that question, ask a different question: Can we achieve (fill in the blank) with a less comprehensive approach?

Here is an example: An organization would like a consistent view of its products to ensure that the price offered by various sales channels for each product is always the same. The proliferation of different copies of the product catalog across different sales channels (telesales, brick & mortar, web sales) means that prices are inconsistent depending on how the customer found the product, which means that the company is essentially competing with itself for product sales. The business driver is establishing consistent product pricing data to eliminate variance and the costs (e.g., matching to the lowest price) associated with the issue.

Okay, to what extent is the business process improved by having consistent product pricing data? This means assessing the actual business impacts associated with inconsistent pricing data. What are the cost categories? What are the business impacts? How often does this occur? What is the scale of the price differences? If this happens once out of 200 product prices and the difference is a few pennies, this may add up, but does it add up enough to justify the level of effort of instituting MDM?
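That kind of impact assessment is just arithmetic once you put stakes in the ground. As a back-of-the-envelope sketch (every figure below is hypothetical, chosen only to illustrate the calculation):

```python
# Back-of-envelope impact estimate for inconsistent pricing.
# All figures are hypothetical illustrations, not data from the article.

mismatch_rate = 1 / 200          # 1 in 200 product prices is inconsistent
avg_price_gap = 0.05             # average difference: a few pennies
catalog_size = 50_000            # products in the catalog
annual_sales_per_product = 100   # units sold per product per year

affected_products = catalog_size * mismatch_rate
# Assume the company always matches to the lowest price, so each sale
# of an affected product forfeits the price gap.
annual_cost = affected_products * annual_sales_per_product * avg_price_gap
print(f"Estimated annual cost of price variance: ${annual_cost:,.2f}")
# → Estimated annual cost of price variance: $1,250.00
```

Even at catalog scale, pennies-per-mismatch may total only a few thousand dollars a year, which frames the question of whether an MDM program is proportionate to the problem.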

Okay number 2: A master repository for product data will address this by providing a single source for the product catalog. Can we achieve consistent product pricing with a less comprehensive approach? The answer may be that improved product description matching can provide a daily list of inconsistent prices, which when coupled with a set of manual pricing updates to the dependent systems will result in consistency across the systems.
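The "less comprehensive approach" above can be sketched in a few lines. Everything here is illustrative: the channel names, record fields, and the deliberately crude normalization stand in for real product description matching, which in practice would need fuzzier comparison of messier data.

```python
# Sketch of the lighter-weight alternative: match product descriptions
# across channel catalogs and produce a daily report of price
# inconsistencies for manual remediation in the dependent systems.
from collections import defaultdict

catalogs = {
    "telesales":    [{"desc": "Widget, Large (Blue)", "price": 19.99}],
    "brick_mortar": [{"desc": "widget large blue",    "price": 21.49}],
    "web_sales":    [{"desc": "Large Blue Widget",    "price": 19.99}],
}

def normalize(desc):
    # Crude canonical key: lowercase, strip punctuation, sort the words.
    cleaned = "".join(c if c.isalnum() or c.isspace() else " "
                      for c in desc.lower())
    return " ".join(sorted(cleaned.split()))

prices_by_product = defaultdict(dict)
for channel, items in catalogs.items():
    for item in items:
        prices_by_product[normalize(item["desc"])][channel] = item["price"]

# The exception report: products whose price varies across channels.
for key, prices in prices_by_product.items():
    if len(set(prices.values())) > 1:
        print(f"Inconsistent: {key!r} -> {prices}")
```

The point is not the code but the comparison it enables: a matching-and-reporting process like this carries a very different cost profile than standing up a master repository.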

Now compare that effort with the effort to institute the MDM system. Each plan of attack has different staff requirements, different expectations for level of knowledge and skill, different deployment tactics, different longevity – many different criteria. Calculate the total cost of operations for both and see how cost effective each is with respect to the scale of the problem.
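That total-cost comparison can also be made concrete. A minimal sketch, with every dollar figure invented purely to show the shape of the calculation:

```python
# Hypothetical total-cost comparison of the two plans of attack.
# All figures are invented for illustration only.

def total_cost(initial, annual_ops, years):
    """Simple undiscounted total cost of ownership over a horizon."""
    return initial + annual_ops * years

years = 3
mdm_tco    = total_cost(initial=750_000, annual_ops=150_000, years=years)
manual_tco = total_cost(initial=50_000,  annual_ops=120_000, years=years)

# Estimated annual cost of the pricing problem itself (also hypothetical).
annual_problem_cost = 1_250
print(f"MDM: ${mdm_tco:,}  Manual remediation: ${manual_tco:,}  "
      f"Problem cost over {years} years: ${annual_problem_cost * years:,}")
```

When the remedy's cost dwarfs the problem's cost on both lines, the numbers themselves make the argument the article is driving at.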

This is just a simple example, but the concern behind the concept is sound: when a technology that is touted as a general solution overwhelms the problem, it poses a risk to good practices in the important situations where the technology truly is the solution. This means that considering whether MDM (or any other technology!) is the best solution requires focusing on addressing the business drivers above all else, and considering the different options to assess feasibility and potential for success. More on this to follow in coming months.
