Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is the leading expert in decision management and a passionate advocate of decisioning technologies: business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

July 2009 Archives

David Vergara wrote a nice piece over on Target Marketing recently - Use effective segmentation with predictive analytics to personalize customer relationships. David does a good job of outlining the steps involved in segmentation modeling, a key area for data mining and predictive analytics.

To use analytics to personalize customer relationships, however, I think you need to go further: you need to understand the decisions you make that contribute to the relationship so you can personalize those decisions and, thus, the overall relationship. With an understanding of the decisions involved, it will be clear how and where to apply the segmentation and analytic models you have developed - they will help you make better decisions.

To take personalized actions for a customer, you must make personalized decisions about those customers. You must decide if a particular customer will respond positively to this offer. You must decide how hard you want to try to retain a customer, or how flexible you should be in collections. You must decide if a cross-sell is appropriate given a customer's current state and concerns, and what offer you should use. You must decide that the tradeoff is worthwhile before you offer a new product to replace an existing one. You must identify the decisions that affect your customers and manage them with effective decision services.

Decision services contain rules that are defined by regulation, by policy, by expertise and by the customer's preferences. They must also contain segmentation models so that different kinds of customers can be treated differently and predictive analytic models that turn uncertainty about a customer's future behavior into usable probability. And they can use optimization technology to manage tradeoffs and ensure the best use of resources.
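To make this concrete, here is a minimal sketch (in Python) of how a decision service might layer those pieces together. The segment names, thresholds, and scoring logic are invented purely for illustration and are not drawn from any particular product.

```python
# A minimal sketch of a decision service for a retention decision.
# All thresholds, segment names, and the scoring logic are illustrative
# assumptions, not any vendor's actual implementation.

def churn_probability(customer: dict) -> float:
    """Stand-in for a predictive analytic model's score (0..1)."""
    # A real deployment would call a deployed, scored model; this fakes one.
    return min(1.0, 0.1 + 0.05 * customer.get("support_calls", 0))

def segment(customer: dict) -> str:
    """Stand-in for a segmentation model."""
    return "high_value" if customer.get("annual_spend", 0) > 5000 else "standard"

def retention_decision(customer: dict) -> str:
    """Rules from policy, regulation, and preference applied to model outputs."""
    if customer.get("opted_out_of_marketing"):      # customer preference rule
        return "no_contact"
    p_churn = churn_probability(customer)           # predictive analytic model
    seg = segment(customer)                         # segmentation model
    if seg == "high_value" and p_churn > 0.4:       # policy rule
        return "offer_premium_retention_package"
    if p_churn > 0.6:
        return "offer_standard_discount"
    return "no_action"

print(retention_decision({"annual_spend": 8000, "support_calls": 7}))
```

The point is the layering: the models turn data into scores and segments, while the rules, which are transparent and editable, decide what action each combination leads to.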

Using decision services to automate decisions in this way allows you to put predictive analytics and segmentation models to work. This is the topic of a white paper I wrote (available here) and an upcoming presentation and tutorial at Predictive Analytics World.


Posted July 27, 2009 6:02 AM

Oz Analytics - The Darker Side Of Analytics was an interesting little post discussing the risk of using analytics, in this case, to profile potential criminals based on past behavior. The use of analytics to predict crime and criminals is certainly growing and, as Steve said in his post, you have to

wonder how many times their 'digital techniques' will create false positives and (presumably) false information being sent out?

In my opinion the question is not whether analytics have such risk - clearly they do - but whether the use of analytics increases or decreases the risk of a false positive.

Without analytics, this kind of proactive policing (essential not just to stopping pedophile tourists but also to catching terrorists, for instance) relies on human judgment. Humans, unlike analytics, are prone to prejudices and personal biases. They judge people too much by how they look (stopping the Indian man with a beard, for instance) and not enough by behavior (stopping the white guy who is nervously fiddling with his shoes, say). They tend to be driven by recent results to the exclusion of older ones, and much more besides (see this post on decision-making traps). If we bring analytics to bear on a problem, the question should be whether it eliminates more biases and bad decision making than it creates new false positives. Over and over again, studies show analytics do better in this regard (check out some great examples in Super Crunchers). So, personally, I think analytics are ethically neutral and the risk of something going "to the dark side" is the risk that comes from the people involved, with or without analytics.
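As a purely illustrative piece of arithmetic (every number below is an assumption), the false-positive question hinges on base rates for any screening method, human or analytic; what matters is which method has the better error rates on the same population.

```python
# Illustrative arithmetic only: how a low base rate drives false positives
# for any screening method. All numbers are made-up assumptions.
base_rate = 0.001            # 1 in 1,000 people screened is actually a target
sensitivity = 0.90           # P(flagged | target)
false_positive_rate = 0.05   # P(flagged | not a target)

population = 1_000_000
targets = population * base_rate
non_targets = population - targets

flagged_true = targets * sensitivity
flagged_false = non_targets * false_positive_rate
precision = flagged_true / (flagged_true + flagged_false)

print(f"Innocent people flagged: {flagged_false:,.0f}")
print(f"Share of flags that are correct: {precision:.1%}")
```

Whether the screening is done by an officer's gut feel or by a model, those three rates determine the damage; the empirical question is which approach delivers the better rates.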

This post was found through Smart Data Collective syndication.


Posted July 20, 2009 12:38 PM

I regularly post reviews of new products over on my main decision management blog. Sometimes I cross-post them here - like today with River Logic's Enterprise Optimizer.

River Logic's Enterprise Optimizer is what is increasingly known as an "Integrated Business Planning" solution. Enterprise Optimizer is designed to manage cross-functional decisions at strategic, tactical, and policy levels, considering all the elements and consequences of those decisions. The models you build allow you to see the financial and operational impact of those decisions and then optimize them.

Enterprise Optimizer comes from work done by the University of Massachusetts with mathematicians from the Russian Academy of Sciences. The group had some background in AI, focused on capturing expert know-how to improve operational processes. From this research they moved to financial modeling and, over the last 15 years or so, have modeled over 200 different problems in various industries, working with a range of partners.

The product that has evolved from this has recently been labeled by Gartner as an Integrated Business Planning (IBP) tool. IBP is defined as a collection of technologies, applications, and processes that connect planning functions across the enterprise to improve organizational alignment and financial performance. The technologies help companies understand, communicate, and manage constraints and consequences across the whole enterprise. The idea is not to just roll up numbers and pass them on but to have a more dynamic model of the connections. Unsurprisingly, they do a fair amount of work with companies adopting the Beyond Budgeting Round Table model.

The requirements for IBP include explicit process mapping (how a company creates value); financial modeling (ROI and forward-looking, activity-based cost, P&L, and marginal opportunity analysis - all considering process constraints); a holistic view (products, customers, resources, supply chain processes, partners, etc.); and extensive optimization and business rules capabilities (objective function, rules, constraints, etc.). Plus collaboration, integration, and monitoring.

While Enterprise Optimizer is a horizontal technology, River Logic is focused on delivering EO-based solutions in a couple of areas, especially Consumer Packaged Goods, with Healthcare as a secondary market. For example, CPG solutions include strategy modeling (product portfolio, capital planning/network design), policy (inventory policy/product segmentation, sourcing, planning frequency), S&OP (executive, master planning, production planning, etc.), customer profitability, and cost to serve.

The product itself has a simple diagram-style interface used to create the business processes that drive value in an organization. These diagrams model the supply chain and show how things like trade promotions impact volume, distribution, and financial performance. Tactical planning solutions are bounded by policy, financial, and regulatory constraints, from working capital to carbon emissions. The models also report forward-looking costs (akin to ABC costs but projected forward considering the constraints of the business), P&L, balance sheet roll-up, cash flow, etc. Enterprise Optimizer models processes and more, but it doesn't execute them - the model is simply built and the engine figures out what the constraints and cost-drivers are.

The basic approach can be illustrated by considering a simple Purchase-Inventory-Conversion-Inventory-Sales process. The PICIS model is very common in manufacturing organizations: they buy raw materials (Purchase), which creates Inventory; that inventory is then manufactured (Conversion) into finished goods (Inventory) that must be sold (Sales). EO lets you easily create a process with a basic set of nodes, one for each step, and translates this model into a set of mathematical representations so it can run analyses against the nodes.

Each node has a different representation and the user can specify different kinds of information for each node type. When the model is executed, additional information is created on each node - the engine calculates things like opportunity value (e.g., the marginal profit from one more item or an additional customer) or optimal production schedules. Lots of information is defaulted, based on extensive research, so the model can be run quickly once basic information is filled in - users, of course, find it easier to edit a model once they can see what it does. As the user adds more information, the model becomes more constrained and more accurate, and the tool is designed to support a highly iterative style of working.
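To give a feel for what translating the model into mathematical representations means in practice, here is a minimal sketch of a one-product PICIS chain expressed as a linear program in Python with SciPy. The prices, costs, and capacities are made-up numbers and the formulation is mine, not EO's; EO's models are far richer, but the structure is the same: maximize profit subject to the constraints the nodes impose.

```python
# A toy PICIS (Purchase -> Inventory -> Conversion -> Inventory -> Sales)
# chain as a linear program. All numbers are illustrative assumptions.
from scipy.optimize import linprog

def solve(capacity: float) -> float:
    """Return maximum profit for a given conversion capacity."""
    raw_cost, conv_cost, sale_price = 4.0, 2.0, 10.0
    # Decision variables: units purchased, converted, sold.
    c = [raw_cost, conv_cost, -sale_price]   # linprog minimizes, so negate revenue
    A_ub = [
        [-1, 1, 0],   # cannot convert more than was purchased
        [0, -1, 1],   # cannot sell more than was converted
        [0, 1, 0],    # conversion capacity limit
        [0, 0, 1],    # market demand limit
    ]
    b_ub = [0, 0, capacity, 1200]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    return -res.fun   # profit

base = solve(capacity=1000)
print(f"Profit at capacity 1000: {base:.0f}")
# The "opportunity value" of one more unit of capacity, found by re-solving:
print(f"Marginal value of extra capacity: {solve(capacity=1001) - base:.2f}")
```

Re-solving with one extra unit of capacity is a crude way to show the kind of opportunity value the EO engine reports on each node; a real solver exposes these marginal (shadow) prices directly.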

The basic nodes support different elements of the business (a hypothetical sketch of their attributes follows the list):

  • Purchase nodes allow the price and constraints (minimum or maximum units available per period, etc.) to be specified for a user-defined list of raw materials. Once the model has been executed, the node displays things like opportunity value (the profit from getting one more unit of an item).
  • Conversion nodes specify the different machines or resources that convert raw materials into finished goods. For each resource the user can specify characteristics such as labor rate, fixed costs, and work units per period. Conversion nodes can also contain the processes that run on the resources; process costs, rates, setup costs, etc. can all be specified. Various forms of cost analysis, activity-based costing, and throughput accounting are supported.
  • Inventory nodes specify the various products or materials and their prices, etc. A flow from a conversion node to an inventory node allows you to map materials to the processes that produce them; the flow from raw materials inventory to resources allows you to specify the BOM or recipe for the various products.
  • Sales nodes let you specify various constraints on sales, model price elasticity with non-linear constraints, and so on.
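Purely as a hypothetical illustration, the kind of information sitting behind each node type might be captured along these lines; the attribute names are my assumptions, not EO's actual schema.

```python
# Hypothetical data shapes for the four basic node types; names are
# assumptions for illustration, not EO's real model.
from dataclasses import dataclass, field

@dataclass
class PurchaseNode:
    prices: dict                     # raw material -> unit price
    max_units_per_period: dict = field(default_factory=dict)

@dataclass
class ConversionNode:
    labor_rate: float
    fixed_costs: float
    work_units_per_period: float
    processes: dict = field(default_factory=dict)  # process -> (rate, setup cost)

@dataclass
class InventoryNode:
    unit_prices: dict                # product or material -> price

@dataclass
class SalesNode:
    max_demand: dict                 # product -> demand ceiling per period
    prices: dict = field(default_factory=dict)

# Flows between nodes carry the BOM/recipe, e.g. raw inventory -> conversion.
bom = {"widget": {"steel_kg": 1.2, "paint_l": 0.1}}
```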

There's more, with each node supporting a potentially very large amount of information about the step, how it operates, and its financial implications. A fifth node type, financial report, can be added and mapped to a series of financial reports. The financial models can be specified in detail, but there is a lot of useful defaulting built in, based on research with PricewaterhouseCoopers.

Once a minimum amount of information is specified behind the nodes, the engine can be used to create a model of the business based on that specification. EO will optimize for profit on any unconstrained variable. Options to do detailed unit cost analysis and other kinds of analysis exist and can be added to the model run. Running the model updates it with implied attributes and optimized values, which the user can then adjust as necessary. EO also allows the models and constraints to be extended so that companies can model non-financial, non-process constraints and measures, such as the number of truck trips through residential neighborhoods per day or other company-specific metrics.

River Logic is also building an "IBP ecosystem" to make it easier for companies in the CPG and Healthcare spaces (initially) to deploy IBP solutions.

Excel, of course, is the major alternative, and it is typically augmented by EO. Most EO users are doing what-if analysis and scenario planning - not real-time, workflow-oriented, day-to-day optimization such as that done by folks using CPLEX, Dash, or Dynadec. The integrated financials and built-in accounting best practices are, to my mind, the key differentiator, though EO also has a web-based scenario management tool that allows users to name, store, retrieve, and compare entire models side by side.


Posted July 16, 2009 8:53 AM
John Elder of Elder Research is well known in data mining circles and speaks and teaches regularly. Not only has John recently released a new book (Handbook of Statistical Analysis and Data Mining Applications), he has also now released his great seminar on the top 10 mistakes in data mining on YouTube! Highly recommended for anyone working with data mining (or with data miners).
Check out:
Part 1: Top 10 data mining mistakes
Part 2: Don't rely on only one technique
Part 3: Don't extrapolate
Part 4: The path to data mining success


Posted July 9, 2009 11:23 PM

The Evidence Based Management blog had a post on Why experts are so often wrong that discusses a book by Philip Tetlock (Expert Political Judgment: How Good Is It? How Can We Know?).

In a world filled with expert predictions that are mostly incorrect, and filled with people who eagerly seek such predictions even though they are incorrect, Tetlock's book explores why experts are so often wrong and why we listen to them anyway. There is no more evidence-based subject matter than forecasting. This book provides an excellent overview of the perils and pitfalls in making forecasts.

But if we can't rely on experts to synthesize information and pass judgments on to us, can we make our own? Perhaps not, according to another post, this time from the Institute for the Future. In So much information, such limited ability to understand it all, Vivian Distler quotes Stephan Dahl:

we make the assumption that they will be able to keep up with and synthesize the abundance of information that may be relevant to their health. 21% of adult Americans have only rudimentary skills, leaving them unable to extract even simple information from printed material. A further 25% can perform simple reading functions but "cannot integrate or synthesize several facts" from documents.

and she goes on to ask

Information will have to be made accessible and understandable.  Are we ready to take on that obligation?

Personally, I don't think it is the information that needs to be made accessible and understandable, but the decisions that must be made with that information. We should not allow experts to make judgments without process (as I discussed here), and we cannot rely on consumers (or front-line staff) to have the necessary analytic skills. Instead we should focus our experts on understanding how we might make a good decision and build that expertise (those rules) into a decision-making system that also allows those impacted to add constraints or additional rules. These rules could take advantage of sophisticated analytics without just handing over decision-making power to the analytic models.
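As a minimal, hypothetical sketch of that idea in Python: expert rules consume an analytic score but retain control over the decision, and the person affected can add constraints of their own. The scoring function, options, and thresholds are invented for illustration.

```python
# Hypothetical sketch: expert rules use an analytic score without handing
# over the decision, and the affected person can add their own constraints.
def treatment_risk_score(patient: dict) -> float:
    """Stand-in for an analytic model's risk estimate (0..1)."""
    return 0.3 if patient.get("age", 0) < 65 else 0.7

def recommend(patient: dict, patient_constraints: list) -> str:
    risk = treatment_risk_score(patient)
    options = ["surgery", "medication", "watchful_waiting"]
    if risk > 0.6:                      # expert/policy rule uses the score
        options.remove("surgery")
    # The person affected can rule options out (their own rules/preferences).
    options = [o for o in options if o not in patient_constraints]
    return options[0] if options else "refer_to_specialist"

print(recommend({"age": 72}, patient_constraints=["medication"]))
```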

The end result could and should be a transparent system that makes decisions consistently, based on the data and on robust analysis of that data, using rules that come from regulations, policies, expertise, and personal preference. An action support system, not just a decision support system. Just a thought...


Posted July 6, 2009 7:30 AM