

A Capability Model for Business Analytics, Part 2: Assessing Analytic Capabilities

Originally published June 9, 2009

The first article of this series presented the capability model for business analytics illustrated in Figure 1. The intent of the model is to give organizations a means to quantify analytic capability. The model encompasses three dimensions. Vertically, analytic capability is stratified into six levels based on those of the SEI Capability Maturity Model from Carnegie Mellon University; these levels describe the degree of discipline with which analytic processes are managed. The second dimension – analytic roles – has only two members: analytic creators and analytic users. The third dimension – analytic usage – describes seven applications of analytics, ranging from performance analysis to geospatial analysis.



Together these dimensions produce a model with 84 cells. Now let’s take the model from the theoretical to practical application. The goal of the model is to quantify an organization’s analytic capability with sufficient indicators to provide a measure upon which decisions can be made and actions taken – to form the basis of an analytic capability assessment. Ideally that assessment should be fast and light – easy to perform yet informative in its results.

The Capability Assessment Process

To assess we must first collect data, and the matrix model provides the structure for data collection. Fortunately it isn’t necessary to examine all 84 cells of the model to assess analytic capability. The capability levels are, in fact, a product of the assessment and not variables for data collection. The roles dimension and the usage dimension intersect to create fourteen data points as illustrated in Figure 2.

 





A total of fourteen data points certainly meets the criteria of fast and light. With so few data points, however, every value must be correct for the assessment to be accurate. To provide informative results, each response must be carefully considered. The ideal assessment process involves a group of stakeholders who represent a broad cross-section of the business analytics community – those who create analytics and those who use analytics throughout the core functions of the business.

The purpose of this group is to arrive at consensus responses for each of the fourteen data points. Consensus, of course, begins with discussion. A simple spreadsheet provides the structure to drive discussion and constrains the set of responses such that they are quantifiable. Figure 3 illustrates the assessment spreadsheet that you can download here.





The yellow shaded areas are the spaces in which responses are collected – one column of responses for creating analytics and one for using analytics. The dropdown lists limit the set of responses in each column. Behind the scenes, the spreadsheet converts text responses to numerical values and derives capability assessment scores in the columns on the right. We’ll look at the scoring portion later in this article. But first, let’s look more closely at collecting the data.
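The behind-the-scenes conversion can be sketched in a few lines of Python. This is a minimal illustration only: the downloadable spreadsheet's actual weights and formulas are not reproduced in this article, so the numeric level assigned to each response here is a hypothetical assumption.

```python
# Hypothetical mapping of allowed responses to numeric capability levels
# (1 = lowest). The real spreadsheet's weights are not published here,
# so these values are illustrative assumptions only.
RESPONSE_SCORES = {
    "rare or none": 1,
    "pockets of competency": 2,
    "defined requirements": 3,
    "standard metrics": 3,
    "development projects": 3,
    "embedded analytics": 4,
    "measurement and feedback": 4,
    "repeatable processes": 4,
    "integrated and reused": 5,
    "evolving portfolio": 6,
}

USES = ["performance", "behavioral", "predictive", "causal",
        "discovery", "textual", "geospatial"]

def score_assessment(responses):
    """responses: dict mapping (use, role) -> response text, where role
    is "creator" or "user". Returns per-use scores and an overall average."""
    per_use = {}
    for use in USES:
        creator = RESPONSE_SCORES[responses[(use, "creator")].lower()]
        user = RESPONSE_SCORES[responses[(use, "user")].lower()]
        # Average the creating and using columns for each analytic use.
        per_use[use] = (creator + user) / 2
    overall = sum(per_use.values()) / len(per_use)
    return per_use, overall
```

For example, an organization that answers "Rare or None" in every cell would score 1.0 across the board – the floor of the hypothetical scale above.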

As previously stated, each response must be accurate for the assessment to have value. Responses must be carefully considered and represent consensus that is achieved through discussion. Accurate responses, then, require two support structures – defined terms and discussion guidelines.

Definitions of Terms

Choosing the correct response for each cell in the spreadsheet requires that you understand the terms that describe the cell and the terms used to define the set of allowed responses. There are three sets of terms for which definition is needed:
  • The members of the roles dimension – analytic creators and analytic users.

  • The members of the usage dimension – performance, behavioral, predictive, causal, discovery, textual, and geospatial analyses.

  • The allowable set of responses – evolving portfolio, integrated and reused, repeatable processes, development projects, measurement and feedback, embedded analytics, standard metrics, defined requirements, pockets of competency, and rare or none.

THE ROLES

Analytic Creators: The people who collect measurement data, define metrics, and build the processes to derive and deliver those metrics to analytic users.

Analytic Users: The people who use measures and metrics to gain insight into business events and circumstances, to develop foresight into future business performance, and to inform and enhance decision-making processes.

THE USES

Performance Analysis: Use of measures and metrics to understand performance of business processes, to improve process effectiveness, to eliminate processing bottlenecks, and to make efficient use of financial, human, material, and other resources.

Behavioral Analysis: Use of data, measures, and metrics to understand and classify patterns of events and responses in customers and consumer groups. Behavioral analysis is a foundation element of targeted marketing, personalized customer communications, and many aspects of customer relationship management (CRM).

Predictive Analysis: Use of data, statistical modeling, and data mining techniques to find patterns and probabilities in historical data and apply those patterns and probabilities to make predictions about future events and conditions. Credit scoring is perhaps the most widely known example of predictive analytics, but many other applications are possible. Predictive analysis can be applied in virtually any industry or business functional area.

Causal Analysis: Use of data, measures, and metrics to find the causes of particular events or conditions. Causal analysis seeks to get beyond symptoms to find root causes, and it recognizes that multiple causal factors influence every event or condition. Causal analysis identifies the strength of influence of each causal factor, providing insight that is valuable to decision-making processes. Internet shopping provides an example of multi-factor cause and effect. Among the factors that cause a prospect to buy or to not buy are ease of navigation, product selection, product pricing, payment options, and many other variables.

Discovery Analysis: Examination and investigation of data to find new facts or relationships, and to become aware of things previously hidden or unknown. Discovery analysis is most commonly performed using data mining techniques to uncover hidden patterns in data. Opportunity recognition and fraud detection are common applications of discovery analysis.

Textual Analysis: Examination, investigation, classification, and structuring of text data to derive high-quality, searchable, and accessible information from a body of unstructured text. Text analysis includes parsing, lexical and semantic analysis, clustering, tagging, and pattern recognition to create structured data from unstructured data, to associate text with keywords, to quantify relevance, and more. Common uses of text analytics include content management and online media, security, marketing, legal discovery, biomedical research, and academic applications.

Geospatial Analysis: Use of geographic and location data to understand business patterns, behaviors, and events. By combining geographic and location data with other business data, new insights become possible that enhance decision-making processes and help to optimize business processes. Geographic mapping and visualization tools make it easy to see trends and patterns in large volumes of data. Some examples of business and government applications include telecommunications network planning and design, retail site selection, transportation logistics, urban traffic planning and management, and crime analysis in law enforcement.
 
THE RESPONSES

Evolving Portfolio: Analytics development is based upon a systematically managed collection of analytic capabilities and systems that is aligned with business processes and information needs, and that continuously adapts to business change.

Integrated and Reused: Analytic development processes have sufficient data governance discipline to ensure that measures and metrics are consistently defined, non-conflicting, non-redundant, and reused across business functions and analytic applications.

Repeatable Processes: Analytic development processes have a methodological foundation. They are defined, documented, repeatable, and teachable processes that are used consistently across the community of analytics developers.

Development Projects: Analytics development activities are formalized as projects with all of the key project management elements of planning, execution, monitoring, control, and closure.

Measurement and Feedback: Business analysis is an integrated and central activity of business management. The metrics-based management and decision-making culture is one of routine measurement and continuous feedback.

Embedded Analytics: Analysis is an integral part of many business processes. Analytics and the analysis procedures and activities that use those analytics are embedded into the business processes.

Standard Metrics: The business has a set of defined and standard metrics for core processes, entities, and performance indicators. Analytic users are familiar with and use the standard metrics.

Defined Requirements: Business analysis activities are more than ad hoc exploration of data. Analysis begins with a statement of purpose and needs for information, understanding, or insight.

Pockets of Competency: Some people in some areas or departments have the ability to create and/or use analytics in ways that create valuable business knowledge or insight. Those areas that have analytic competencies are the primary analytic users.

Rare or None: Creating and using analytics is not a common practice of information management.

Discussion Guidelines

Developing a consensus set of responses for each cell of the spreadsheet depends on communication and discussion. In addition to common definitions, it is useful to have a set of topics to guide a complete and multifaceted discussion. I suggest several topics as a good discussion agenda:
  • Analytic architecture

  • Business capabilities

  • IT capabilities

  • Analytic culture

  • Analytic technology

  • Decision processes

  • Data management

  • Data integration

  • Data quality

  • Data accessibility

  • Desktop analysis

  • Collaborative analytics

Combine each topic with roles and uses as illustrated in Figure 4 to develop a rich and robust agenda for discussion.




Assessment Results

Analytic capability scores are calculated when all of the responses have been considered and entered into the spreadsheet. Capability scores are derived for each analytic role and for each analytic use. Scores are aggregated for each dimension and in total for the entire assessment as shown in Figure 5. 
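The roll-ups described here – a score per role, a score per use, and a total for the assessment – can be sketched as simple averages of the fourteen cell values. Since the spreadsheet's exact formulas are not reproduced in the article, both the sample numeric values (assumed on a 1-6 scale) and the unweighted averaging below are illustrative assumptions.

```python
# Illustrative aggregation of the fourteen cell scores. Plain averages
# are used as a plausible stand-in for the spreadsheet's formulas; the
# sample values below are hypothetical.
scores = {
    "performance": {"creating": 4, "using": 3},
    "behavioral":  {"creating": 2, "using": 2},
    "predictive":  {"creating": 3, "using": 1},
    "causal":      {"creating": 2, "using": 2},
    "discovery":   {"creating": 1, "using": 1},
    "textual":     {"creating": 1, "using": 1},
    "geospatial":  {"creating": 2, "using": 1},
}
roles = ["creating", "using"]

# One score per role: average that role's column across all seven uses.
role_scores = {r: sum(scores[u][r] for u in scores) / len(scores)
               for r in roles}

# One score per use: average the two role columns for that use.
use_scores = {u: sum(scores[u][r] for r in roles) / len(roles)
              for u in scores}

# Total assessment score: average across all fourteen cells.
overall = sum(role_scores[r] for r in roles) / len(roles)
```

With the sample values above, the creating role averages higher than the using role – the kind of asymmetry that the interpretation step in the next article would need to explain.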





Seeing the scores is interesting, but the numbers are only indicators. Some interpretation is needed to make the transition from interesting to informative. And some planning is needed to make the shift from informative to actionable. Understanding and applying the capability assessment is the subject of the next and final article in this series.



  • Dave Wells

    Dave is actively involved in information management, business management, and the intersection of the two. He provides strategic consulting, mentoring, and guidance for business intelligence, performance management, and business analytics programs.
