Blog: James Taylor
I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges. Copyright 2012
Tue, 23 Nov 2010 09:37:51 -0700
Analytics and the art of selling
Cross-posted from JTonEDM

I saw this interesting McKinsey piece recently - Rediscovering the art of selling - and I was struck by the value of analytics in this context. What retailers really need to do, according to McKinsey, is focus on hiring sales people with personality - extroverts motivated by helping customers. And they need to spend time training these folks on sales techniques, approachability, reading body language (to tell who wants to be left alone) and much more. Do this, the article says, and your closing, cross-selling and up-selling will be far more successful. So far so good.

But the reality of a modern retailer is that there are a tremendous number of products, with lots of potential cross- and up-sells to choose from. Even someone with the skills you need might not be good at, say, color matching, making it hard for them to make the right clothing choices for an outfit. Add in multiple discounts, loyalty programs and other forms of dynamic pricing and you have a complex environment. Retailers feel they have to invest resources and time in training staff about these things, reducing the time available for sales skills training, and even that they must hire for an ability to understand this complexity at the expense of the personality and sales skills they need. A case of being between a rock and a hard place?

No, enter analytics. With decent analysis of their historical data and a focus on the decisions that have to happen during the sales process, retailers can spend their time and energy training staff on the soft skills they need and let their systems and analytics do the rest of the work. They can use business rules, analytics and decisioning to answer questions like: what discount does this customer get? What's the best up-sell for this customer given their purchases? What's the best cross-sell to complete the outfit they are buying? They can analyze sales data, loyalty card data and external data, and use rules derived from this data or from their best sales people. If they adopt the "swipe first" loyalty card approach, they can empower their staff to do even more by leveraging everything they know right at the start of a conversation.
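To make the idea concrete, here is a minimal sketch in Python of the kind of decision service described above. The customer data, products, rule names and affinity scores are all invented for illustration; in practice the discount rules and cross-sell scores would come from your pricing policy and from mining historical sales and loyalty data.

```python
def discount_for(customer):
    """Business rule: loyalty tier drives the discount (invented tiers)."""
    tiers = {"gold": 0.15, "silver": 0.10}
    return tiers.get(customer.get("loyalty_tier"), 0.0)

def best_cross_sell(basket, affinities):
    """Pick the highest-affinity item not already in the basket.

    `affinities` maps item -> {candidate: score}, with scores
    mined from historical sales data.
    """
    candidates = {}
    for item in basket:
        for other, score in affinities.get(item, {}).items():
            if other not in basket:
                candidates[other] = max(candidates.get(other, 0.0), score)
    return max(candidates, key=candidates.get) if candidates else None

customer = {"loyalty_tier": "gold"}
affinities = {"navy blazer": {"grey slacks": 0.8, "white shirt": 0.6}}
print(discount_for(customer))                        # 0.15
print(best_cross_sell(["navy blazer"], affinities))  # grey slacks
```

The point is that the sales associate never sees these rules; the system surfaces the answer at the point of sale so they can concentrate on the conversation.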

Your staff don't need to be able to make all these "technical" decisions - you can build systems that act as effective advisers, freeing your staff to work on their people skills, customer interactions and actually selling.

Predictive Analytics Tue, 23 Nov 2010 09:37:51 -0700
A really fresh look at business analytics
A friend sent me a link to a webinar on "The End of BI as We Know It" that promised "a fresh look at what business analytics means". It wasn't clear who was speaking or what company was sponsoring, but the title intrigued me (as it was meant to). When I looked at the body of the description, though, I was underwhelmed. There was, frankly, nothing new. The webinar promised to explain several things - presumably things that were "fresh" or not "BI as we know it". But here's the list (edited to summarize):

  • Speed to deploy, to build, to get analysts serving themselves is critical
  • Must be able to analyze data from production databases and handle millions or billions of records
  • It's critical to combine multiple data sources from the data warehouse to Excel
  • Must be able to easily and quickly build dashboards

All this is pretty mainstream as far as I am concerned - no-one wants a BI tool that is slow, that can't access data from various sources, that can't handle lots of rows or that doesn't let you build dashboards.

And there was nothing about decision making, nothing about supporting different kinds of decision-making (from collaborative, strategic decisions to high-volume operational decisions), nothing about data mining or predictive analytics, nothing that fundamentally changes how companies can put data to work improving their business.

I gave a speech some months back in South Africa called "Does BI Matter" (audio and slides here in a large PDF) and I have blogged before about Why thinking about decisions should be a BI best practice. If you think you can improve day-to-day operations by giving everyone dashboards or reports, then you haven't visited your call center lately. If you think that the way your systems work, the way your website works, should not also be improved by applying analytics, then you underestimate the extent to which your systems are your business. If you think the time it takes to build reports or the ease with which you can build dashboards are the critical measures of success, then you are focused on means and not ends.

The reason you spend money on Business Intelligence is to provide intelligence about your business not just so you can have a BI platform. You have BI to make better decisions, to improve the way you run your business. If you don't know which decisions you are improving then you are unlikely to make progress.

If that list really counts as "fresh" and "the end of BI as we know it" then we should all be worried....

Decision Management Thu, 07 Oct 2010 09:48:51 -0700
Why thinking about decisions should be a BI best practice
I was struck today by a short but effective Information Builders PowerPoint - Four Worst and Four Best Practices in Business Intelligence. I really liked the worst practices - especially the one about assuming that business people have the skills or time to learn to use a BI tool. I blogged not long ago about the problem that most people are not very good at math, and this is just as true when considering BI more generally as when thinking about data mining and analytics.

It's also true that many of the people targeted by BI tools don't have the time to use drill-down and analysis tools. Think about the folks in the call center - they want answers, not an ability to explore, so that they can finish the call. This is why it is important to think about the decisions involved and who you want making them. Knowing the decision and the decision maker will help you determine whether you need BI tools to help people decide, or analytics and rules to automate the decision. And remember, just because someone passes on the result of a decision does not mean that the same person is qualified to make it. A call center representative might be the one to pass on a denial of a refund, for instance, but you might want someone else to decide which refunds get denied. Automating the decision allows one person to control how the decision is made while others pass on the results.
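As a sketch of this separation, the refund decision could live in one centrally-owned function (the thresholds below are hypothetical, for illustration only) that the call center system calls. The representative passes on the result without having to make, or be able to change, the decision itself.

```python
def decide_refund(amount, days_since_purchase, item_returned):
    """Centrally-maintained decision logic - one owner, many callers.

    Returns (outcome, reason); thresholds are invented examples.
    """
    if not item_returned:
        return ("deny", "item not returned")
    if days_since_purchase > 90:
        return ("deny", "outside 90-day window")
    if amount > 500:
        return ("refer", "needs supervisor approval")
    return ("approve", "within policy")

print(decide_refund(49.99, 10, True))   # ('approve', 'within policy')
print(decide_refund(49.99, 120, True))  # ('deny', 'outside 90-day window')
```

Change the policy in this one place and every channel - call center, web, store - decides the same way from then on.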

I was also struck by the worst practice of selecting a BI tool without a specific business need. I spoke about this when I presented in South Africa earlier this year. If the reason you buy a BI tool is just to have BI then you probably aren't helping your company as much as you could be. Understand the business drivers - the decisions that must be made, the reports required for compliance - and you will do better. You can check out the slides and audio from this presentation on my website - Does BI Matter? (large file warning)

And this brings us back to my favorite best practice - identify your business need upfront. Or, I would say, begin with the decision in mind.
Business Intelligence Thu, 09 Sep 2010 18:30:56 -0700
Fall webinar series
  • Simplifying over-complex processes
  • Delivering customer centricity across multiple channels, multiple platforms
  • Implementing analytics? You need business rules
  • Decision analytics - more than BI and web analytics
You can find all my upcoming events in the Events Calendar.
Decision Management Mon, 30 Aug 2010 11:59:18 -0700
    Gathering BI/DW requirements
    Data Warehouse / Business Intelligence Requirements Elicitation: Where do You Begin? I really liked the fact that early in the discussion the author said:
    After establishing these strategic objectives, make it a priority to get your users talking about their work day, struggles, obstacles, and how they make business decisions as pertains to data (my emphasis).
    I think it is essential when working with data to focus on decisions and on how data and analytics might improve those decisions. I also liked the focus on data mining as one of the steps - not just reporting and "soft" analysis tools - though I would add that deploying the results of data mining needs some serious consideration also. In this vein I also wrote about data integration and the importance of keeping the decision in mind.
    Decision Management Tue, 24 Aug 2010 09:46:42 -0700
    More on decisions and decision management
    I have been thinking more about decisions, decision management and analytics. This was prompted by Tom Davenport's recent interview in the MIT Sloan Management Review, Reengineering your decision making processes, about analytics and how companies make decisions. This interview also prompted Boris Evelson of Forrester to write this blog post on decision management being possibly the last frontier in BI. Boris made a couple of excellent points in his post.

    First he pointed out that, while companies should consider decision making something to understand and systematically improve, not all decision making is the same. Decisions can be divided into those that are fairly structured and follow well defined rules and approaches, and those that are more unstructured and collaborative. Structured decisions tend to lend themselves to precise descriptions of how to make the decision and to repeatable analytics. Collaborative or unstructured decisions, in contrast, tend to lend themselves to exploration and visualization tools. Decisions can also be divided into automated and manual decisions.

    Now, some time ago Neil Raden and I did some work on the characteristics of decisions. Boris's collaborative/structured division combines two - the approach to making the decision and how repeatable the decision is. Other characteristics that really matter when it comes to deciding how to automate or support decisions include how measurable the decision is, how long it takes to see whether you made a good decision or a bad one, and how much difference there is between a good one and a bad one.

    Whether you currently automate a decision or not, it seems to me, is a more transient characteristic of a decision - a consequence of other more fundamental ones. Companies should not be dividing up their decisions into manual and automated so much as conducting a decision audit or decision discovery to understand what decisions they have so they can make the right automation and decisioning technologies choices.

    The importance of ongoing measurement and analysis, however, is an area where Boris and I are in strong agreement. The three phases of decision management are decision discovery to find the decisions that matter, decision services to build components to handle those decisions and then decision analysis to ensure that you continue to improve decisions over time.

    As Boris points out, this last one is critical. If you don't track the results of decisions you will never know what works and what does not. This is part of the reason I think it is so important to map decisions to Key Performance Indicators or KPIs, so that you understand how each decision contributes to the measures that matter to you. Beyond tracking, though, if you don't create a feedback loop so that you can improve decisions based on these results, your decision making will stagnate. And stagnation means it will get worse: a decision is good only in a context, and that context changes continually. I would add that experimentation is also important. You need an ability to create challengers to your current decision making approach, test them on some decisions and compare results to see if a new approach would be preferable. Companies successfully using analytics have all of these - good decision results tracking, a formal feedback loop to keep improving a decision, and an ability to challenge existing decision making with new and innovative approaches.
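A minimal sketch of this champion/challenger experimentation, with made-up strategies and acceptance rates: route a small share of decisions to the challenger and record outcomes per strategy, so the feedback loop has results to compare.

```python
import random

def champion(customer):
    return "standard_offer"   # today's decision approach

def challenger(customer):
    return "premium_offer"    # candidate new approach

def decide(customer, challenger_share=0.1, rng=random):
    """Route most decisions to the champion, a small share to the challenger."""
    strategy = challenger if rng.random() < challenger_share else champion
    return strategy.__name__, strategy(customer)

# Track outcomes per strategy - the raw material of the feedback loop.
results = {"champion": [], "challenger": []}
rng = random.Random(42)  # seeded so the simulation is repeatable
for _ in range(1000):
    name, offer = decide({}, rng=rng)
    # Simulated customer response; real acceptance rates come from tracking.
    accepted = rng.random() < (0.25 if offer == "premium_offer" else 0.20)
    results[name].append(accepted)

for name, outcomes in results.items():
    print(name, round(sum(outcomes) / len(outcomes), 2))
```

If the challenger's tracked acceptance rate beats the champion's by a meaningful margin, it becomes the new champion and the cycle repeats.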

    Decision Management Wed, 28 Jul 2010 16:30:00 -0700
    Data integration and keeping the decision in mind
    Some time ago I was at a warranty conference and there was an interesting discussion about registration cards. You know, those postcard sized mailers you are asked to return to register your product. They often have all sorts of demographic and interest questions - asked by the company to flesh out its 360-degree view of its customers. One of the speakers was asked about this and he argued that, in fact, companies should ask for the absolute minimum information on these cards. This would, he said, increase response rates and would have little or no effect on the value of the data, because all the demographic data could be purchased anyway once you had the list of customers and some basic information about them. In other words, companies were identifying fewer customers because they were worrying too much about the amount of information they collected about those customers. I took a couple of lessons away from this.

    First, always consider the potential for external data to improve an internal process. Just because you want some data does not mean you have to ask the customer for it. Buying external data and integrating it might be more cost-effective. And you might find you can infer the data analytically too, using historical records like purchases or returns to derive customer characteristics like preferences or approach to online purchasing.
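As a small illustration of inferring a characteristic analytically (the transaction data and categories here are invented), a customer's preferred channel can be derived from observed behaviour rather than asked for on a card:

```python
from collections import Counter

def infer_channel_preference(transactions):
    """Derive the preferred channel, and a confidence, from history."""
    channels = Counter(t["channel"] for t in transactions)
    channel, count = channels.most_common(1)[0]
    confidence = count / len(transactions)
    return channel, confidence

history = [
    {"channel": "online"}, {"channel": "online"},
    {"channel": "store"}, {"channel": "online"},
]
print(infer_channel_preference(history))  # ('online', 0.75)
```

The same pattern - aggregate behaviour, derive a characteristic, attach a confidence - works for preferences, price sensitivity and many other attributes you might otherwise have asked for.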

    Second it reminded me of the importance of beginning with the decision in mind. Too often I see companies embarking on data integration and quality initiatives designed to improve all their data - presumably so they can make better decisions - without really thinking through what those decisions are. If you begin, instead, with the decision, then you might find that you only need some of your data integrated, that some of it is good enough to make the decision (even though it is pretty dirty) or that some of the data you need has to be sourced from outside the company anyway.  If you don't know which decision you wish to make or improve then you can't know which data is truly important.

    Business Intelligence Wed, 28 Jul 2010 12:55:00 -0700
    New data mining competition
    IEEE ICDM Contest: Road Traffic Prediction for Intelligent GPS Navigation

    Over the last century, the number of cars engaged in vehicular traffic in cities has increased rapidly, causing many difficulties for all citizens: traffic jams, large and unpredictable communication delays, pollution, etc. Excessive traffic has become a civilization-wide problem that affects everyone who lives in a city of 50,000 people or more, anywhere in the world. The complexity of the processes behind traffic flow is so great that only data mining algorithms may bring efficient solutions to these problems.

    The task of this year's ICDM Data Mining Contest is to predict city road traffic for the purpose of intelligent driver navigation and improved journey planning. The three contest problems are related to congestion forecasting, modeling of traffic jams, and smart navigation based on real-time GPS monitoring. Datasets come from a highly realistic simulator of city traffic, the Traffic Simulation Framework. The competition is organized on the TunedIT Challenges data mining platform by a team of researchers from the University of Warsaw, Faculty of Mathematics, Informatics and Mechanics. Prizes worth $5,000 in total will be awarded to the winners.

    Everyone is welcome to participate. The competition starts now and will last until September 6th, 2010.

    More details here.
    Predictive Analytics Tue, 06 Jul 2010 07:46:20 -0700
    Operational Analytics research available
    I have been working on some research with the BeyeNETWORK and we are pleased to announce the release of "Operational Analytics: Putting Analytics to Work in Operational Systems".

    The basic premise of this report is that the power of analytics to improve decision making has huge potential when applied to the large numbers of decisions made in operational systems. Realizing that potential requires new thinking about analytic techniques and technologies. The study defines terms, explores various analytic approaches and enumerates keys to success and best practices from actual implementations of operational analytics that show the payoff is real.

    The research was sponsored by Adaptive Technologies, Inc.; Aha!; Fuzzy Logix; Oracle; and SAS.

    You can read the executive summary or watch a webinar on the high points of the survey results. The report contains an overview of the opportunity and technologies involved, the results of an online survey, some great case studies and sponsor profiles.

    Decision Management Thu, 06 May 2010 16:52:39 -0700
    It is Rexer Analytics survey time
    Rexer Analytics, a data mining consulting firm, is conducting its fourth annual survey of the analytic behaviors, views and preferences of data mining professionals. This is a really interesting survey (I blogged about the results of the previous one here) and I encourage you to participate. The survey link is (you will need the access code KP9260).

    Karl and his team say:
    Your responses are completely confidential: no information you provide on the survey will be shared with anyone outside of Rexer Analytics. All reporting of the survey findings will be done in the aggregate, and no findings will be written in such a way as to identify any of the participants. This research is not being conducted for any third party, but is solely for the purpose of Rexer Analytics to disseminate the findings throughout the data mining community via publication, conference presentations, and personal contact.
    You can email them directly at if you have questions or want to request previous research summaries. The survey should take approximately 20 minutes to complete.
    Predictive Analytics Mon, 26 Apr 2010 12:52:00 -0700
    Now that's cute - data mining on an iPad
    Oracle Data Mining on iPad - and I have to say it is pretty cute. I recently got a demo of the new Oracle Data Mining capabilities and will post a quick summary soon, but this is a new way to look at the capabilities.

    Obviously there are challenges with any kind of content creation on the iPad (most folks seem to really like it for content consumption, not so much for content creation due to the lack of a keyboard, etc.) but it does illustrate the possibilities inherent in using a compute cloud like's.

    Very cool Karl.
    Predictive Analytics Mon, 19 Apr 2010 18:44:46 -0700
    Telemetry means little without analytics
    UPS turns data analysis into big savings. What was interesting about this to me was not just the use of telemetry (which is interesting enough), but the way analytics acts as a massive value multiplier for this kind of data. In the article you see these great quotes:
    "There's millions and billions of pieces of information flooding out of the vehicle [at all times] into this dumb telematics device,"
    "Now you have to mine it," Levis said.
    Absolutely. Now you have to mine it. There is too much data, arriving too fast, to put it on a dashboard or in a report and get real value from it. Instead you need to mine it for insight and operationalize that insight so it is actionable. Add the rules that make it actionable and then make the decisions, and take the actions, that will add value to your business.

    While not every company has telemetry data, many have high volume data streams that seem to offer the potential for better business operations. But you have to mine it.
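A toy sketch of what "mining plus rules" might look like on a telemetry stream - the thresholds, readings and action text are all invented, not UPS's actual system: score each reading against a rule and emit an action, instead of charting the raw data.

```python
def actions_from_telemetry(readings, idle_rpm=900, max_idle_seconds=300):
    """Turn raw readings into actions: flag vehicles idling too long.

    Each reading is (vehicle, rpm, speed, seconds); thresholds are
    illustrative examples, not real operating limits.
    """
    idle_time = {}
    actions = []
    for vehicle, rpm, speed, seconds in readings:
        if speed == 0 and 0 < rpm < idle_rpm:       # engine on, not moving
            idle_time[vehicle] = idle_time.get(vehicle, 0) + seconds
            if idle_time[vehicle] > max_idle_seconds:
                actions.append((vehicle, "coach driver: excessive idling"))
                idle_time[vehicle] = 0              # reset after acting
        else:
            idle_time[vehicle] = 0                  # moving again
    return actions

readings = [("truck1", 700, 0, 120)] * 3 + [("truck1", 2000, 40, 60)]
print(actions_from_telemetry(readings))
# [('truck1', 'coach driver: excessive idling')]
```

Millions of readings flow in; only the handful of actionable events come out - which is exactly the value multiplier the article describes.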
    Decision Management Mon, 05 Apr 2010 19:10:04 -0700
    An Overview of Data Mining Techniques
    Data Mining Techniques. I had forgotten how useful this was and thought I should re-post it for those of you looking for a nice online summary of core data mining techniques.
    Predictive Analytics Wed, 31 Mar 2010 18:38:07 -0700
    New Rexer Analytics Survey
    I recently got the survey results from the annual data mining survey that Karl Rexer of Rexer Analytics runs. You can get the summary here or the full results from Karl, but here are my thoughts:

    • Data mining is everywhere. The most cited areas are CRM / Marketing and Financial Services with a big lead over Retail and Telecom. Healthcare did poorly, no surprise.
    • The areas data miners most frequently work in are Marketing & Sales, Research & Development, and Risk.
    • Data miners' most commonly used algorithms are regression, decision trees, and cluster analysis - way ahead of the others. Text mining was back in the pack, which is interesting given the number of text mining presentations we saw at Predictive Analytics World.
    • Half of data miners say their results are helping to drive operational processes. This is encouraging as I think this is by far the most effective way to use predictive analytics.
    • Batch scoring, with the results stored in the database, came top of the deployment approaches at 30%, with interactive real-time scoring at 21% and 16% embedding the model into a larger software project.
    • 60% of respondents say the results of their modeling are deployed always or most of the time. This is still not good enough - the other 40% are deployed only sometimes or not at all.
    • The top challenges facing data miners are dirty data, explaining data mining to others, and difficult access to data. However, in 2009 fewer data miners listed data quality and data access as challenges than in the previous year. 34% also have problems with IT.
    • Open-source tools Weka and R moved substantially up data miners' tool rankings this year, and are now used by large numbers of both academic and for-profit data miners.
    There's lots more in the survey so go get it and read it.
    Predictive Analytics Fri, 26 Mar 2010 14:03:41 -0700
    Stephen Few on the problem with BI
    In a recent post, Big BI is Stuck: Illustrated by SAP BusinessObjects Explorer, Stephen Few took issue with the claims made for SAP BusinessObjects Explorer. I have not spent any time with the product so I am not going to discuss his specific criticisms, but I was struck by a caution he added in the post:

    Don't mistake what I've written as a case against Big BI in favor of Small BI. It is entirely possible for large BI vendors to provide effective tools for data sense-making [analytics]. To do this, they need to switch from a technology-centric, engineering-focused approach to a human-centric, design-focused approach, and base their efforts on a deep understanding of data sense-making. Most of the small BI vendors have done no better in cracking this nut than the big guys. They might be more agile due to their small size and thus able to bring a new product to market more quickly, but when they approach the problem in the same dysfunctional way as the big guys, they fail just as miserably. Just like politicians who sell themselves as "not like the guys in Washington," new players in the BI space often point to the failures of the big guys and then go on to do exactly the same. I am not making a case of small vs. big, but of clear-headed, informed, and effective vs. an old paradigm that doesn't work for the challenges of data sense-making.

    It seems to me that part of what Stephen is getting at here is a need to focus not on the technical capabilities but on the ability of a tool to support better decision-making. I see his post as pointing out a key reason I believe companies must begin with the decision in mind, figuring out what kinds of analytic insight will help improve a specific decision and drilling back into their data from there. In contrast, most companies today start with the data and go forward - and most BI tools (big BI, small BI, in-memory BI, SaaS BI) work this way too.

    Predictive Analytics Fri, 19 Mar 2010 14:37:29 -0700