Impacts of Institutionalized Reactivity to Data Errors
Originally published January 27, 2011
One of the more frequent engagements my consulting company performs is a data quality impact analysis. This involves a series of interviews with many different types of people in the organization, ranging from front-line operational staff to senior managers in strategic roles, and many others in between. While the results of the impact analysis vary, the business impacts are largely the same, and in many cases so are the types of errors. So this month I thought I would provide some details about one of the most common findings: institutionalized reactivity.
Perhaps a good way to explain what I mean is by first providing an example. At one client, I spoke with a number of business and information technology people. I found that within each line of business, there was a clear understanding of particular types of data issues—enough so that each group had some number of full-time staff members dedicated to addressing the issues as they were identified. The interesting part of this is the euphemistic terminology. “Addressing the issue” actually meant going into the target systems, fixing the data errors, and then notifying the source system owners of the error. “Identifying issues” meant that a customer had seen an error on a statement or in an online account and had called the call center to register a complaint.
This is not an isolated scenario. Rather, we have seen this many times in many different types of industries. Essentially, within each operational group, there is enough of a general concurrence as to the existence of data problems that staff time is allocated to reacting to the problem and fixing it when it is discovered at a late stage by members of the one constituency that should be the least exposed to data errors: the customers. Often there are good estimates of the amount of effort being expended in reacting to the issues (“10% of staff time,” “4-5 hours per week,” “an average of 2 hours per incident, 4-5 incidents per week”). This is what I mean by institutionalized reactivity: the assumption that, since data errors are expected and the path of least resistance is to assign someone the task of fixing the errors as they happen, data quality activities should simply be blended directly into day-to-day job descriptions.
And if you were to add up all the time allocated to this kind of reactive work, you might find that it amounts to a few full-time staff equivalents per year. In many cases, the cost of these staff members is relatively small in the scheme of things, especially when the remediation is performed by low- or mid-level employees. So here is a deeper question: what are the real impacts that are attributed to institutionalized reactivity?
To answer this question, let’s focus on the high-level impact categories discussed in my book: Financial, Productivity, Risk, and Satisfaction (see www.dataqualitybook.com for additional notes and pointers, by the way). We can consider the different ways the organization is affected, but organized in a slightly different order:
- Satisfaction. In many cases, this forms one key component of the issue: Is the organization relying on its customer constituency to be its quality control process? If so, to what extent does that impact overall satisfaction? Discussions with some clients suggest that exposure of data errors to the customer damages the company’s credibility, which can lead to decreased satisfaction and ultimately customer attrition.
- Risk. Often, some data processing and production errors lead to violation of business rules (at least) or even legal noncompliance (at worst) regarding security of access to customer data or exposure of (what should be protected) private data. In some cases, exposure of private data can trigger a series of additional impacts—mandated reporting to government agencies, negative coverage from regional media outlets, initiating customer credit reporting, etc.
- Productivity. Presuming that staff members are allocating their time to researching and fixing data problems, it means that time is not being spent doing the “stuff” they were originally assigned to do. At one client, I was told that a small number of marketing analysts spent 80% of their time extracting, fixing, massaging, and preparing data for analysis and only 20% of their time doing the analysis. Their value as marketing analysts could have been increased fourfold if only those percentages were reversed.
- Financial. From this standpoint, there is a perceived incurred set of costs related to the staff time allocated to fixing the problem. But within the previous areas of impact, we see follow-on costs that are much more comprehensive, including increased attrition, reduction in customer lifetime value, costs relating to managing exposures, “spin” (such as paying for credit reporting), and loss of value because of productivity gaps.
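The “fourfold” figure in the Productivity bullet follows directly from the time split. A quick sketch, assuming a 40-hour work week:

```python
# The marketing analysts spent 20% of their time on analysis and 80% on
# data preparation; reversing the split quadruples the analysis output.
hours_per_week = 40  # assumed work week

current_analysis = 0.20 * hours_per_week   # 8 hours of actual analysis
improved_analysis = 0.80 * hours_per_week  # 32 hours if the split is reversed

multiplier = improved_analysis / current_analysis
print(multiplier)  # 4.0 -- the fourfold increase
```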
A cursory view would suggest to managers within each of the business areas that the issues are not critical enough to warrant urgently understanding, and consequently eliminating, the root causes. Yet the cumulative effect of the downstream impacts, coupled with what is probably a “worst practice” of having the customers do the data quality validation, might indicate that institutionalized reactivity should be replaced by proactivity.