Top 10 Trends in Business Intelligence and Data Warehousing for 2005 Revisited

Originally published June 7, 2005

In late 2004, Knightsbridge Solutions sought to peer into the future and divine the top trends for business intelligence and data warehousing. I thought it would be fun to revisit those predictions and do a mid-year check to see if the trends are still on track.

Based on our work with Fortune 500 companies and discussions with many other companies and experts in the field, Knightsbridge's top 10 business intelligence and data warehousing trends for 2005 reflect what's happening within the financial services industry. Business intelligence and data warehousing have become major priorities across most enterprises in financial services.

Validating this movement, business intelligence and data warehousing made Gartner's list of top CIO priorities for the first time in 2004. If we look at the major business issues in 2005, all of them require greater access to data - data of "certified" quality and accuracy, data at an enterprise level rather than a line-of-business (LOB) level - coupled with an enhanced level of analytics aimed at managing the business from a fact-based perspective. Business intelligence is no longer a synonym for "reporting." It truly has grown into its last name - "intelligence." The push is squarely behind using data and analytics to reduce the uncertainty involved in managing a large enterprise.

The push behind the acceleration of this trend is clearly Congress and the regulators. With an increasing tempo of compliance dates approaching, every management team in the financial services industry is looking to their technology teams to deliver—on time—the data and the “intelligence” within the data. The interesting thing about this push is that, to deliver, the technology teams are pushing for major investments in basic data infrastructure and in data governance. This, in turn, is pushing management teams to reconsider traditional LOB-focused organizations and to approve spending that has been avoided over the past five to seven years.

All of this, in turn, has pushed management teams to realize that the ROI for these huge investments in data and data infrastructure can only be realized by fundamentally changing the way they manage their businesses. Hence, the push to not just meet regulatory or legislative requirements but to move beyond those limits to ensure that fact-based management can flourish and evolve in the future.

With those observations in place, let’s now revisit the predictions and see how they have held up over the early months of 2005:

Trend #1: Taking data quality seriously

Original Observation: Very few enterprises set out to remedy data quality problems just for the sake of data quality. So what’s pushing enterprises to actually do something about data quality, instead of just talking about it? First, poor data quality costs them money in terms of lost productivity, faulty business decisions and an inability to achieve results from expensive investments in enterprise applications. Second, poor data quality can make regulatory compliance extremely difficult.  It’s true that many companies have cleaned up their customer data to enable CRM-related initiatives. However, their focus has now turned to data in other areas of the business, such as supply chain and finance, and to tackling what can seem like intractable data quality problems in nearly every business domain.

META Group, now owned by Gartner, predicts that the market for data quality software and services will grow 20 percent to 30 percent annually through 2007, supporting the observation that companies are committed to actually doing something about their data quality problems. However, it can be tough for organizations to know where to begin. For example, how can data quality be measured so that problem areas can be identified? Which issues should be addressed first? There are numerous methodologies and tools available to help companies sort through their data quality issues, but it’s important to choose one that takes a practical approach. Addressing data quality in the abstract is impossible. 

REVISIT: This trend is holding solid, especially within the financial services industry. This is especially true for those firms seeking to comply with the Basel II Accord. Inaccurate data has no place in a risk system used to price loans and calculate capital requirements for the institution. The interesting shift here is a dramatic change in thinking regarding how to meet the data quality challenge. At first, management focused on tools to measure quality with the thought that the “owner” of the data should fix the problem at the source system—e.g., scrubbing and re-training. Now, there is a fundamental shift in thinking. Data quality is about far more than entering data correctly the first time. There is now a realization that data quality must be baked into the ETL/sourcing processes that stage data for analysis.
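To make the "baked into ETL" point concrete, here is a minimal sketch of a quality gate applied at the staging step rather than relying on source-system cleanup alone. The field names, rules, and quarantine approach are all illustrative assumptions, not a description of any particular institution's pipeline.

```python
# Hypothetical data quality rules applied while staging loan records for
# analysis. Field names ("loan_id", "balance", "risk_rating") and the
# allowed rating set are illustrative only.

def validate_loan_record(record):
    """Return a list of quality violations for one staged record."""
    errors = []
    if not record.get("loan_id"):
        errors.append("missing loan_id")
    balance = record.get("balance")
    if balance is None or balance < 0:
        errors.append("invalid balance")
    if record.get("risk_rating") not in {"A", "B", "C", "D"}:
        errors.append("unknown risk_rating")
    return errors

def stage_records(records):
    """Split incoming records into clean rows (loaded downstream)
    and rejects (quarantined with their violations for remediation)."""
    clean, rejects = [], []
    for rec in records:
        errs = validate_loan_record(rec)
        if errs:
            rejects.append((rec, errs))
        else:
            clean.append(rec)
    return clean, rejects
```

The design choice the trend describes is visible here: bad rows never reach the risk system; they are quarantined with an explicit list of violations, which doubles as the measurement feed for data quality reporting.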

Trend #2: Infrastructure standardization and consolidation

Original Observation: If someone asked you right now how much your enterprise spends on business intelligence and data warehousing annually, would you be able to come up with a reasonable estimate? If your company is like most organizations, the answer is probably no. Enterprises tend to know what they’re spending for ERP and other core systems, but not for business intelligence and data warehousing. That’s because business intelligence and data warehousing efforts have largely been undertaken in silos, with each business domain creating its own solutions to the problem of obtaining and analyzing data. This siloed approach almost always results in duplication of efforts, inefficiency and increased expense.

Enterprises have come to recognize their disparate business intelligence and data warehousing solutions as a problem over the past couple of years. Their interest has been particularly piqued in these lean economic times, when eliminating duplicate business intelligence tools or data marts might result in lower license costs and maintenance expense. Improved access to information, while more difficult to quantify, is also an important benefit of eliminating silos. However, standardizing and consolidating a business intelligence and data warehousing infrastructure is far easier said than done. It involves political and organizational issues that are just as challenging as the technology issues. 

REVISIT: This trend is on track. In the financial services industry, the old “reporting” aspects of business intelligence and data warehousing were LOB issues and expenditures. Thus the “enterprise” never really knew the true cost of business intelligence and data warehousing. With the need to meet legislative and regulatory mandates, the costs associated with rebuilding the infrastructure to support enterprise data access and analysis have risen to the board-of-directors level. These costs are clearly seen as a major and exceedingly high-risk expenditure from two perspectives:

  • First, if the project fails, executives will face sanctions, fines in the tens of millions and possible jail time; and
  • Second, these large data projects are often estimated to be the largest single capital expenditure an institution has ever made.

Trend #3: Offshore sourcing for business intelligence and data warehousing

Original Observation: Although there has been much buzz (and controversy) surrounding offshore sourcing in the past several years, companies have been hesitant to send their business intelligence and data warehousing work offshore because it requires more business knowledge and customization than other types of projects.  However, the cost savings aspect of offshore sourcing can be irresistible, especially with pressure coming from the top of the organization to do more with less.  Enterprises look at the hourly rates of offshore personnel and don’t see how they can lose with costs that cheap.  Lured by the promise of cost savings, enterprises have increasingly taken the leap to offshore for some of their business intelligence and data warehousing development work.  The initial verdict on the results is somewhat mixed.  Some companies have found that despite offshore hourly rates that are only a fraction of onshore costs, they haven’t achieved the savings they thought they would.  Quality problems, communication issues, and other difficulties companies may not have foreseen have led to less-than-successful offshore initiatives.

Enterprises can’t just flip a switch, go offshore, and expect to reap the cost savings—that sounds so simple and obvious, but many companies have tried to do just that.  More thought needs to go into organizing offshore efforts.  Which activities should be sent offshore?  How can an enterprise best work with offshore personnel?  At the same time, how can an enterprise help its in-house personnel accept the offshore model and become more valuable to the organization?  Implications of offshore sourcing must be considered carefully before jumping in with both feet, especially in the case of business intelligence and data warehousing.  Cost savings can be had, but not without careful planning.

REVISIT:  This trend is also holding solid—with a twist.  More and more financial service institutions are negotiating contracts with offshore firms.  However, they are negotiating Service Level Agreements (SLAs) that actually have teeth in them to protect the cost advantage they are trying to capture.  As they push for these strong SLAs, two things are happening:

  • First, the offshore companies are starting to develop an ONSHORE presence so that they can better monitor their SLA exposure and develop clear communications with their clients.
  • Second, financial institutions are realizing that their decision is not just onshore vs. offshore.  It is evolving to be more complex.  Seeking to undercut the cost advantage of the traditional offshore companies, onshore firms are developing capabilities that expand the decision to onsite vs. offsite vs. offshore. 

With these developments, the business case for offshore has become more complicated.  This has brought forward the need for financial institutions to develop processes and SLAs that will enable them to tap into this more productive but more complex environment.

Trend #4: Strategic approach to information

Original Observation: Slowly but surely, enterprises are recognizing information as a strategic part of their business.  Very few have put the idea of “data as an asset” into practice, but within many organizations there is a group of individuals who recognize the strategic value of information.  Members of senior management have become increasingly receptive to this viewpoint.  What’s the evidence that this is happening?  Business intelligence and data warehousing has become a critical part of other projects with far-reaching implications for the business.  Companies may not be implementing enterprise-wide business intelligence for business intelligence’s sake, but they are incorporating business intelligence and data warehousing into other key enterprise projects that promise to optimize business processes and deliver benefits to the bottom line.

So how does a company begin to treat its information as an asset?  Developing an information strategy and architecture is the first step, along with setting some basic standards for data governance.  This needs to be a joint activity of IT and the business because organizational and political issues are just as important as technology issues.  IT must communicate with the business to discover their needs and understand which data is really driving the answers to crucial business questions.

REVISIT:  Another trend that is holding firm.  It is interesting that in the last five months, I have been asked to hold seminars for senior executives and the C-suite group (CEO, COO, CRO, CCO, etc.) to address fundamental questions like:

  • We have spent tens of millions of dollars on our data warehouses. Why are we not seeing a gain in productivity or revenue?
  • Every time I ask for a report, I am told either that it will take at least eight weeks or that we do not have the data.  Why is this?

As we stated earlier, legislative and regulatory mandates have fundamentally shifted management’s view of, and appreciation for, solid, accurate and high-quality data.  This, in turn, is driving the development of enterprise data strategies – the true mark that data has become a valuable asset.

Trend #5: Regulatory compliance as a driver for business intelligence and data warehousing

Original Observation: Achieving compliance with the Sarbanes-Oxley Act of 2002 is a major concern for executives.  In short, Sarbanes-Oxley is designed to reinforce the financial accountability of public companies by requiring executive certification of financial results, accelerated reporting of quarterly and annual financial results, and rapid reporting of any events that may materially impact a company’s financial condition.  Achieving compliance with these requirements will certainly demand improvements to companies’ data infrastructures and reporting capabilities.  And Sarbanes-Oxley is only one of the new regulatory directives companies are dealing with.   Environmental and data privacy regulations are other issues that have companies scrambling to make sure they have access to the necessary data. 

This scrambling has driven investment in solutions that enable information access (or security, as the case may be) and ensure data accuracy.  Regulatory compliance has become a key impetus for undertaking business intelligence and data warehousing initiatives and has attached some very real consequences to not having access to quality data.  In the case of Sarbanes-Oxley, META Group estimates that business intelligence and business performance management (BPM) account for a full 30 percent of the technology profile of a successful solution.  Many enterprises may be tempted to ignore the business intelligence and data warehousing component to focus on their ERP and financial systems, but the visibility and transparency provided by business intelligence and data warehousing solutions is just as critical.

REVISIT:  There is no doubt that this trend still holds true for financial services – in spades!  In fact, most financial service industry management teams dream of having to worry only about Sarbanes-Oxley.  As it is, they have to confront Basel II, the USA PATRIOT Act, Gramm-Leach-Bliley, the Bank Secrecy Act, and investigative probes by attorneys general in numerous states. These pressures not only drive investment, they drive anxiety and uncertainty – both powerful forces.

Please stay tuned as we revisit the final five trends in the next Business Intelligence Network Newsletter on June 14th.

  • Duffie Brunson

    Duffie is a Senior Principal for Financial Services at Knightsbridge Solutions. With more than 30 years of experience in financial institutions as both a banker and consultant, he has been involved with leading-edge developments within the industry, including the creation of the automated clearinghouse, the debit card, in-home transaction services, co-branded credit cards, electronic payment networks, financial advisory/planning services and integrated customer data warehouses.

    Duffie holds an undergraduate degree from the University of Virginia and an MBA from Georgia State University. He is a graduate of the Seidman Auditing School at the University of Wisconsin, and the Stonier School of Banking at Rutgers University. He has served as a member of the ABA's Operations and Automation Quality Council, as a faculty member of the Graduate School of Banking at Colorado, and as a lecturer at the Universita' Cattolica del Sacro Cuore in Milan, Italy.

    Duffie can be reached at dbrunson@knightsbridge.com.

    Editor's note: More financial services articles, resources, news and events are available in the Business Intelligence Network's Financial Services Channel. Be sure to visit today!
