

Basel II Compliance Drives the Importance of Master Data Management

Originally published October 18, 2005

In recent months, there has been tremendous turmoil within major financial institutions. Many Basel II compliance programs have “hit the wall” after attempting to tackle too much too quickly, and as a result were dramatically altered in scope. At the same time, confusion ensued in Washington, where the media began reporting the possibility of a three-year delay in full Basel II Accord implementation (see American Banker’s October 6 article, “Backlash on Basel Hits Fed”).

When the Basel II Accord was introduced, most major banks believed it would dramatically elevate the way credit risk could be managed. The shift to an actively managed portfolio, complete with the potential for real risk-based pricing, was a compelling dream. This idea helped launch Basel II compliance projects that focused on building, for the first time, enterprise-level data warehouses for risk data. Since most major institutions had not ventured into large-scale enterprise data projects before, new ground had to be broken in several areas, including data quality, data ownership, data governance and the new sourcing and publishing processes required by Sarbanes-Oxley. The result was a set of very expensive data projects that, with Basel II compliance deadlines looming, ran out of time roughly three months ago and “hit the wall.”

To some, the notion of wall-hitting may evoke memories of Pink Floyd, with a twist on the lyrics: “We don’t need no regulation… we don’t need no thought control…” For Basel II compliance project managers, though, hitting the wall meant significantly altering projects in order to meet deadlines. Instead of focusing on warehousing enterprise data, the basic change was to concentrate on producing the required reports: a shift toward business intelligence and away from data warehousing.

Along with this shift in focus came some fascinating discussions and concerns. In August 2005, I attended a Basel II compliance team meeting at one of the top 10 U.S. banks. The team included bank executives, risk managers and technology leads. The meeting, which lasted three hours, was fascinating to me. It featured the technology people educating the “user community” about the importance of having master data, metadata and reference data strategies and tools. Another interesting discussion concerned the benefits of developing a robust semantic layer. Previously, I have seen these topics put non-technology people to sleep faster than a hypnotist. But this was different. Here, the executives and risk managers actively participated in the discussion. They earnestly sought to understand these concepts and data tools, and how their use was an important component of meeting the Basel II compliance deadlines.
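For readers unfamiliar with the term, a semantic layer is essentially a governed mapping from the business vocabulary risk managers use to the physical columns in the warehouse, so that reports are defined against stable business names rather than schemas. The Python sketch below illustrates the idea; all table, column and term names are hypothetical, not drawn from any bank’s actual environment.

```python
# A minimal sketch of a semantic layer: business terms used by risk
# managers map to physical warehouse columns, so report definitions
# reference stable business names instead of schemas.
# All table, column and term names are hypothetical.

SEMANTIC_LAYER = {
    "exposure at default": "ead_amt",
    "probability of default": "pd_pct",
    "loss given default": "lgd_pct",
    "obligor rating": "internal_rating_cd",
}

REPORTING_VIEW = "risk_dw.basel_exposures_v"  # hypothetical warehouse view


def build_report_query(business_terms):
    """Translate business terms into a SQL SELECT against physical columns."""
    try:
        columns = [SEMANTIC_LAYER[term] for term in business_terms]
    except KeyError as missing:
        raise ValueError(f"no semantic mapping for business term {missing}")
    return f"SELECT {', '.join(columns)} FROM {REPORTING_VIEW}"


if __name__ == "__main__":
    # A report author asks for business concepts, not column names.
    print(build_report_query(["obligor rating", "exposure at default"]))
    # SELECT internal_rating_cd, ead_amt FROM risk_dw.basel_exposures_v
```

The payoff is maintainability: when a physical column moves or is renamed, only the mapping changes, and every report built on the business terms keeps working.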

In subsequent meetings, the executives and risk managers began to see how metadata and reference data could play a critical role in implementing a robust data quality process. They quickly agreed that metadata and reference data were the key tools for controlling the data governance process. Most fun of all was observing how people began to understand the “semantic layer,” and how a robust implementation could greatly improve both the accuracy of reports and the speed with which they could be produced.
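To make the data quality point concrete: a governance process of this kind typically means that every incoming risk record is checked against centrally maintained reference data before it can feed a report. The sketch below, with hypothetical code sets and field names, shows the shape of such a check.

```python
# A minimal sketch of reference-data-driven data quality: each incoming
# exposure record is validated against centrally governed code sets
# before it is loaded for reporting. Code sets and field names are
# hypothetical.

VALID_RATING_GRADES = {"AAA", "AA", "A", "BBB", "BB", "B", "CCC", "D"}
VALID_COUNTRY_CODES = {"US", "GB", "DE", "JP", "IT"}  # excerpt only


def validate_record(record):
    """Return a list of data quality errors for one exposure record."""
    errors = []
    if record.get("rating_grade") not in VALID_RATING_GRADES:
        errors.append(f"unknown rating grade: {record.get('rating_grade')!r}")
    if record.get("country_cd") not in VALID_COUNTRY_CODES:
        errors.append(f"unknown country code: {record.get('country_cd')!r}")
    if (record.get("ead_amt") or 0) < 0:
        errors.append("exposure at default must be non-negative")
    return errors


if __name__ == "__main__":
    records = [
        {"rating_grade": "BBB", "country_cd": "US", "ead_amt": 1_500_000},
        {"rating_grade": "B++", "country_cd": "USA", "ead_amt": -10},
    ]
    for rec in records:
        issues = validate_record(rec)
        print("PASS" if not issues else f"FAIL: {issues}")
```

The governance angle is that the code sets live in one governed place; every feed is validated against the same authoritative version rather than each system keeping its own copy.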

All of these concepts became real and tangible for the team, and they began to link together and make sense. The chart below illustrates this linkage:

Because of this, I proclaimed “thank you, Basel!” in this article’s summary. For the first time, we are helping senior executives truly understand why data and business intelligence infrastructure is an important and necessary expenditure.

Duffie Brunson

    Duffie is a Senior Principal for Financial Services at Knightsbridge Solutions. With more than 30 years of experience in financial institutions as both a banker and consultant, he has been involved with leading-edge developments within the industry, including the creation of the automated clearinghouse, the debit card, in-home transaction services, co-branded credit cards, electronic payment networks, financial advisory/planning services and integrated customer data warehouses.

    Duffie holds an undergraduate degree from the University of Virginia and an MBA from Georgia State University. He is a graduate of the Seidman Auditing School at the University of Wisconsin, and the Stonier School of Banking at Rutgers University. He has served as a member of the ABA's Operations and Automation Quality Council, as a faculty member of the Graduate School of Banking at Colorado, and as a lecturer at the Università Cattolica del Sacro Cuore in Milan, Italy.

    Duffie can be reached at dbrunson@knightsbridge.com.

    Editor's note: More financial services articles, resources, news and events are available in the Business Intelligence Network's Financial Services Channel. Be sure to visit today!
