
Data Quality and Data Governance: The Basics

Originally published February 15, 2005

In last month’s article, we began discussing an organization’s path toward building an enterprise data hub. A critical part of this endeavor is the framework upon which an enterprise data governance and quality effort should be built—a data governance entity that is empowered by senior management, funded, accountable and “closed loop” (i.e., data quality issues are resolved at the source, not merely cleansed downstream). The following chart depicts this framework at a high level.

Some points to highlight about this framework:

  • Data quality is not a separate, single activity. Data quality is incorporated into every aspect of the framework. The prime function of governance is to improve and maintain the quality of the data; thus, to be successful at governance, quality must be continuously measured and the results continuously fed back into the governance process.
  • Data quality does not equal data cleansing and scrubbing. Data quality is a much more involved process that focuses an organization’s resources on addressing the quality issues at the source vs. after the fact in the warehouses and analytic/reporting platforms.
  • Source systems and their data stores have been included in the framework. As anyone who has tackled the issue of data quality knows, it is far easier and cheaper to fix data issues at the source than to rely completely on data cleansing and scrubbing. It is for this reason that our framework explicitly links source systems and files to the data governance entity. Ownership and accountability for the quality of the data must reside with the business owners of the source systems. Now for a dose of reality: some may feel it is naive to think that all, or even any, data quality issues will get solved at the source when a large enterprise may have multiple conflicting sources, and the real challenge is often getting different divisions to agree to use a single source system. Thus, it is usually wise to build rigorous structures into the data quality management architecture to verify and remedy source data quality and to certify the quality of target analytic data.
  • Technology is not explicitly highlighted as a separate component either. Technology is an absolute requisite to ensure the success of any governance and quality effort and is, therefore, integrated within the entire framework. 
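
The “closed loop” idea above—flagging defects against their source system for the accountable owner to fix, rather than silently scrubbing them downstream—can be sketched in a few lines. This is only a rough illustration; the record fields, rule names and source-system labels are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class QualityIssue:
    source_system: str  # the system of record where the defect originated
    field: str
    problem: str

def route_issues(records, rules):
    """Check each record against the rules; instead of cleansing
    failures in the warehouse, log an issue against the source
    system so its business owner can fix the root cause."""
    issues = []
    for rec in records:
        for field, check, problem in rules:
            if not check(rec.get(field)):
                issues.append(QualityIssue(rec["source_system"], field, problem))
    return issues

# Hypothetical records and rules for illustration only.
records = [
    {"source_system": "CRM", "customer_id": None, "country": "US"},
    {"source_system": "Loans", "customer_id": "C-17", "country": ""},
]
rules = [
    ("customer_id", lambda v: v is not None, "missing customer id"),
    ("country", lambda v: bool(v), "missing country code"),
]
for issue in route_issues(records, rules):
    print(issue.source_system, issue.field, issue.problem)
```

The key design point is that each issue carries its source system, which is what makes owner accountability—and root-cause repair—possible.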

Let’s start our look into data governance and data quality at the top with the data governance entity. To be successful, data governance has to be more than a collection of ad-hoc data quality projects. It is, therefore, imperative that a data governance structure be formed that will ensure that authority is delegated from the senior-most levels of the firm to the appropriate parties and that these parties be held accountable to execute against their respective mandates. While there are several possible governance structures, a common theme that runs across them is the segregation of activities and responsibilities into layers—strategic, tactical and execution.


Key Responsibilities

Strategic

  • Ratify/modify data management principles
  • Ensure ongoing funding is available
  • Identify opportunities and issues
  • Understand costs and benefits
  • Define priorities
  • Monitor progress

Tactical

  • Execute to the priorities of the strategy
  • Ensure the availability of processes and infrastructure
  • Focus on coordinating tactical delivery
  • Leverage existing implementation efforts or initiate separate projects
  • Manage and report opportunities and issues
  • Analyze costs; monitor, track and report on progress against goals and objectives

Execution

  • Implement projects as defined by the tactical layer
  • Educate developers, end users, etc., on data standards and the importance of data quality
  • Audit (sampling and monitoring) data quality to ensure compliance with standards for both internal and external data
  • Participate in system-related projects to ensure standards (data model, metadata, etc.) are incorporated in development/enhancement


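The audit responsibility at the execution layer—sampling and monitoring data for compliance with standards—can be sketched as follows. This is a minimal illustration, not a prescribed implementation; the “approved credit grades” standard and the field names are hypothetical.

```python
import random

def audit_sample(rows, standard, sample_size, seed=0):
    """Draw a random sample and report the share of sampled rows
    that comply with the standard; sampling keeps the audit
    affordable for large internal or external data sets."""
    rng = random.Random(seed)
    sample = rng.sample(rows, min(sample_size, len(rows)))
    compliant = sum(1 for r in sample if standard(r))
    return compliant / len(sample)

# Hypothetical standard: credit grades must come from an approved list.
approved_grades = {"AAA", "AA", "A", "BBB", "BB", "B"}
rows = [{"grade": g} for g in ["AAA", "A", "XY", "BB", "zz", "BBB"]]
rate = audit_sample(rows, lambda r: r["grade"] in approved_grades, sample_size=6)
print(f"compliance: {rate:.0%}")
```

In practice the compliance rate would be tracked over time and reported to the tactical layer, closing the monitoring loop described above.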
The following is but one example of an organization structure. It is presented here not as a recommendation for all institutions, but as an illustration of how a data governance organization structure can appropriately involve executive management.

Sample Governance Entity


Since the concept of enterprise data governance is new to many organizations and since key components of data management are not well established, questions abound regarding the structure and responsibilities of the data governance execution layer highlighted above. One of the most common questions is, “How does real work get done within this structure?”

Let’s look at one segment of data. Enterprise risk management is emerging as a major issue within most financial institutions and is very data-centric. Let’s use this issue as an example to demonstrate both structure and responsibilities. The following is a generic functional organization chart and key responsibilities for such a group:

Risk Data Management Group

(Sample organization/responsibilities at the data governance execution layer)

Key activities/responsibilities of the risk data management group would include:

  • Establish and maintain risk data standards for:

              - Data warehouse architecture

              - Business intelligence (BI) and reporting architecture, and

              - Quality assurance and release management

  • Build and maintain a Basel II-compliant enterprise risk data model, including authority over future database development (marts, etc.) to ensure that new data stores use or extend the model
  • Build and maintain the metadata repository

             - Includes technical and business definitions of risk data and processes

  • Build and maintain an enterprise-wide risk reference data repository, covering risk definitions of customers, industry groups, credit grades, etc.
  • Set up and run data quality routines to continuously measure, identify root causes, alert, report, audit (and systematically correct) data quality issues
  • Build and maintain risk data documentation and artifacts
  • Interface with business

              - Problem identification/resolution

              - End-user support, including data access, BI tool support, etc.

              - Teach and preach

              - User certification

  • Interact with risk governance body

              - Periodically review standards and best practices, and aid in establishing enterprise-wide risk data policies and procedures

              - Perform periodic data quality impact analysis

              - Develop data quality solution proposals and an ROI business case for the proposed solution alternatives

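The continuous data quality routines listed above—measure, alert and report—can be sketched in miniature. The rules, thresholds and field names below are hypothetical, and a real implementation would persist metrics and feed root-cause analysis; this only shows the measure-then-alert shape of the loop.

```python
def measure(records, rules):
    """Run each rule over the data and return a per-rule failure
    rate: the raw measurement that feeds reporting and
    root-cause analysis."""
    n = len(records)
    return {name: sum(0 if check(r) else 1 for r in records) / n
            for name, check in rules.items()}

def alerts(metrics, thresholds):
    """Compare failure rates to agreed thresholds and list the
    metrics that breach their limit, for escalation to the
    governance body."""
    return [name for name, rate in metrics.items()
            if rate > thresholds.get(name, 0.0)]

# Hypothetical risk records and rules for illustration only.
records = [{"exposure": 100.0}, {"exposure": -5.0}, {"exposure": None}]
rules = {
    "exposure_present": lambda r: r["exposure"] is not None,
    "exposure_nonnegative": lambda r: r["exposure"] is not None and r["exposure"] >= 0,
}
metrics = measure(records, rules)
print(alerts(metrics, {"exposure_present": 0.05, "exposure_nonnegative": 0.05}))
```

Running the same rules on every load is what turns data quality from a one-time cleansing project into the continuous measurement the framework calls for.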
The risk data management group would serve as the execution arm in the data governance structure proposed earlier. The head of the risk data management group would serve on the data council.

By forming a risk data management group like the one above, the necessary focus would be put on the effort to build an enterprise risk data environment. By placing this working group within the overall data governance entity discussed earlier, all of the appropriate resources of the institution can be brought to the effort. Resources and priorities can be set and managed with the advice and counsel of executive management. Such a structure neatly “attaches” the working groups with executive management and overall institution strategies.

In next month’s article, we will address practical approaches to data quality and discuss how to achieve measurable results.

  • Duffie Brunson

    Duffie is a Senior Principal for Financial Services at Knightsbridge Solutions. With more than 30 years of experience in financial institutions as both a banker and consultant, he has been involved with leading-edge developments within the industry, including the creation of the automated clearinghouse, the debit card, in-home transaction services, co-branded credit cards, electronic payment networks, financial advisory/planning services and integrated customer data warehouses.

    Duffie holds an undergraduate degree from the University of Virginia and an MBA from Georgia State University. He is a graduate of the Seidman Auditing School at the University of Wisconsin, and the Stonier School of Banking at Rutgers University. He has served as a member of the ABA's Operations and Automation Quality Council, as a faculty member of the Graduate School of Banking at Colorado, and as a lecturer at the Universita' Cattolica del Sacro Cuore in Milan, Italy.

    Duffie can be reached at dbrunson@knightsbridge.com.

