
Business Analytics: Considering the Spectrum of Analysis

Originally published July 28, 2011

Over the past few months we have essentially looked at organizational preparedness for introducing a business analytics capability. At some point, with management approval, allocated budget, and engaged business consumers, it is time to start considering how to map the business users’ expectations into a realistic plan for source data selection, evaluation, preparation, and analysis.

A prelude to planning the data selection is considering the different types of analysis that can be performed, and what is expected to be achieved as a result. In our customer engagements, we typically see the maturity of the analyses and techniques progress in step with the levels of expected value. As the organization begins to produce actionable knowledge, the consumers become engaged in finding new ways to analyze data for increasingly better outcomes. Another way of describing this is as a sequence of increasingly sophisticated business questions, such as these:

  1. What, namely what has happened in the past? Many organizations have no means of producing what might be called “general reporting” other than perhaps extracting data from their financial system. While those extracts might provide some level of detail about overall business performance, there might be questions about more specific details.

    For example, the financial system might report the total amount of monthly sales, but the sales managers would prefer to have details of what items were sold by which sales people, by region, time, and channel.

  2. Why, or what are the key issues that influenced what has happened in the past? The next level of sophistication goes beyond reporting what has happened: people begin to look at the root causes and see whether there are any influential patterns or behaviors that are indicative of success (or failure).

    To continue our example, once the sales managers have some visibility into historical behavior, they might want to align the data across different dimensions, or add additional data and attribution, to determine where the best performance occurred (e.g., which geographic areas had the best profitability) and what key distinctions might have led to high performance (e.g., how do those areas differ from all the others?).

  3. What if, or what results might we see presuming we made changes to our processes? Given the ability to evaluate the potential influencing characteristics that lead to success or failure, and presuming an ability to identify the independent variables that influence the outcomes, can we assemble a model to help assess how changes in the environment, processes, and perspectives might lead to better outcomes?

    Again, we can look to our sales example – if we were to determine that a small handful of the salespeople account for a large percentage of the sales, what if those salespeople were assigned to different geographical regions? Would sales in those regions increase? Would sales in their former regions decrease? Are there characteristics of the residents of each region that impact the ability to make sales? Can we look for other regions with similar population characteristics?

  4. What next, or what are some recommended changes and how are those recommendations prioritized? With the right kinds of models, you can evaluate the different scenarios, and if you recall one of my earlier articles on success criteria, those measures and metrics can be employed at this point to measure the potential for increased value for any recommended alternatives. In our sales example, by rearranging sales territories, we can increase sales in certain regions, but that might decrease sales in the regions abandoned by our super salespeople. We might be able to find regions that are not adequately served, but there might be investment costs up front (advertising, setting up offices, etc.) that might impact profitability for a period of time.

  5. How, or what are the most efficient methods for instituting changes? Different risk factors may come into play as well, so augmenting the models with predictive analyses and probabilistic measures of success and prioritization will help guide the decision-making process.

To reflect back to the sales example, you may find that in many cases a certain sequence of events leads to closing a sale, but at some point the probability decreases. Knowing where the process breaks down and how to change it can advise the salesperson in real time on whether to continue trying to close the deal, whether to try alternative approaches to assist in closing, or whether to drop the prospect and concentrate on another with a higher probability of closure.
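As a rough illustration of this kind of probabilistic prioritization, the idea can be sketched in a few lines of Python. The pipeline stages, conversion rates, and prospect names below are entirely made up for the example, and the independence assumption between stages is a simplification; a real model would estimate these from historical data.

```python
# Hypothetical sketch: estimate a prospect's probability of eventually
# closing from historical per-stage conversion rates, then use that
# probability to prioritize which prospects to concentrate on.
from math import prod

# Assumed (illustrative) conversion rates between consecutive stages.
STAGE_RATES = {
    "contacted":   0.60,  # contacted   -> qualified
    "qualified":   0.50,  # qualified   -> proposal
    "proposal":    0.40,  # proposal    -> negotiation
    "negotiation": 0.70,  # negotiation -> closed
}
STAGES = list(STAGE_RATES)

def close_probability(current_stage: str) -> float:
    """Probability a prospect at current_stage eventually closes,
    treating the remaining stage transitions as independent."""
    i = STAGES.index(current_stage)
    return prod(list(STAGE_RATES.values())[i:])

def prioritize(prospects):
    """Sort (name, stage) pairs by descending probability of closure."""
    return sorted(prospects, key=lambda p: close_probability(p[1]), reverse=True)

pipeline = [("Acme", "contacted"), ("Globex", "negotiation"), ("Initech", "proposal")]
for name, stage in prioritize(pipeline):
    print(f"{name:8s} {stage:12s} p(close) = {close_probability(stage):.3f}")
```

The same structure shows where the process breaks down: the stage with the lowest conversion rate (here, the hypothetical "proposal" step) is where a change to the process would have the largest effect on the overall probability.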

Of course, we are using sales as an example, and we can refer to any of our previously identified success measures to drive the process. But let’s reflect on these levels of analysis sophistication: It is much easier to create an application that delivers predefined reports answering existing questions about organizational operations than it is to optimize specific areas of the business. In fact, these levels present a spectrum of analysis, and each progressive level described here builds on the previous levels. In essence, you can’t improve what you can’t measure, as good ol’ Lord Kelvin used to say.

This suggests two things to keep in mind. The first is that as designers and implementers, we need to gauge user expectations and help ratchet those expectations to match the level of sophistication of the analytics capability. Your users might have been sold on the idea that instituting an analytics program was going to improve the business, but they have to be aware that the value is evolutionary, not revolutionary (that is, sales don’t increase when you crack the shrink wrap).

The second is that this evolution has to be incorporated into the roadmap and the program plan. The deliverables of the projects should be mapped to the expected capabilities; at the same time, the user communities can be trained as to how they can make best use of the analytics capabilities as they are deployed.

Recent articles by David Loshin


