
Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is the leading expert in decision management and a passionate advocate of decisioning technologies: business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

Recently in Optimization Category

A little while back I got to spend a few minutes talking about analytics and optimization with Jack Mason of IBM. He posted the resulting video over on the Smarter Planet blog. Enjoy.

Posted October 19, 2009 5:25 AM

I was attending IBM's launch of its analytic appliances when it announced its intent to acquire SPSS. I did not get a chance to write much more at the time, but I did not want to let the opportunity pass completely. I think the announcement represents a sea change in the decision management and analytics markets.

First, it helps IBM deliver the predictive analytic element of its business analytics and optimization story. SPSS has many years of experience in productizing analytic R&D, giving IBM a platform for bringing its own extensive analytic R&D effort to market. The potential for new capabilities based on IBM Research, combined with new channels for PASW Modeler (formerly Clementine), should be great for the analytics market as a whole.

SPSS has been a distant number two to SAS in the data mining/predictive analytics space for many years. While no one, including me, expects the acquisition by IBM to push it past SAS, the deal does represent a huge opportunity. The new channels created for SPSS' products include IBM's worldwide sales force, obviously, as well as its extensive network of partners. IBM's focus on strategic relationships with customers will, I think, be particularly valuable to SPSS, which has a history of selling direct to analysts and modelers. While this direct-to-user approach results in lots of customers, it does not establish SPSS as "strategic" or create a broad commitment to the SPSS modeling tools. IBM is more likely to drive this kind of adoption. Not only is this good for the SPSS product lines, it is good for data mining and predictive analytics more generally, as I think it will raise the profile of modeling in companies.

Beyond analytics, though, the more interesting aspect is the potential for IBM to put together a complete decision management platform. Having the IBM platform support decision management as well as process management, event management and information management would be huge. And while acquiring all the pieces does not automatically give IBM such a platform, it has historically worked to integrate its acquisitions fairly rapidly. I'll blog more about this over on JTonEDM later in the week.

And when we take the two announcements (analytic appliances and SPSS) together, there are some interesting business implications. The Smart Analytics Systems represent another potentially powerful channel for SPSS. The Statistics modules were already going to ship on some of these systems as part of the Cognos installs, but SPSS' deployment products are ideally suited to this kind of appliance-based deployment. SPSS' decision management products - PASW Deploy for risk decisions, marketing campaign decisions and inbound communication decisions - package up models and rules for advanced decision making. Thanks to the simple interfaces that decision services like these expose, the products are ideal for use in appliances. Putting these, and potentially other, decision-centric products on the Smart Analytics Systems will move them forward nicely.

Finally, there is IBM's services business. Some months back IBM announced a new service line in Global Business Services - Business Analytics and Optimization. While Cognos and ILOG's optimization capabilities, combined with various offerings from inside IBM R&D, provided most of the software support this service line needs, the lack of an IBM-branded data mining/predictive analytics offering was glaring. Adding SPSS gives IBM's BAO service line all the tools it needs. Not only will that help BAO, it will drive SPSS into more IBM accounts.

As I said at the time, the importance of IBM's BAO service line should not be underestimated. Today, high-end analytic solutions still require a significant amount of domain expertise and technical integration, as well as multiple products. While I expect that to change - and the PASW Deploy products are an example of the kind of packaging that is required going forward - the ability of 4,000 IBM consultants to deliver more advanced analytics solutions is critical to increasing adoption and awareness around the world.

For more on this consider James Governor's post on IBM and SPSS, Forrester's report on the acquisition, Neil Raden's post on IBM's vision for analytics or Merv Adrian's post on IBM's move into predictive analytics.

Posted August 24, 2009 6:52 PM

I regularly post reviews of new products over on my main decision management blog. Sometimes I cross-post them here - like today with River Logic's Enterprise Optimizer.

River Logic's Enterprise Optimizer is what is increasingly known as an "Integrated Business Planning" solution. Enterprise Optimizer is designed to manage cross-functional decisions at strategic, tactical, and policy levels considering all the elements and consequences of those decisions. The models you build allow you to see the financial and operational impact of those decisions and then optimize them.

Enterprise Optimizer comes from work done at the University of Massachusetts with mathematicians from the Russian Academy of Sciences. The group had a background in AI, focused on capturing expert know-how to improve operational processes. From this research they moved to financial modeling and, over the last 15 years or so, have modeled over 200 different problems in various industries, working with a range of partners.

The product that has evolved from this has recently been labeled by Gartner as an Integrated Business Planning (IBP) tool. IBP is defined as a collection of technologies, applications, and processes that connect planning functions across the enterprise to improve organizational alignment and financial performance. The technologies help companies understand, communicate, and manage constraints and consequences across the whole enterprise. The idea is not to just roll up numbers and pass them on but to have a more dynamic model of the connections. Unsurprisingly, they do a fair amount of work with companies adopting the Beyond Budgeting Round Table model.

The requirements for IBP include explicit process mapping (how a company creates value); financial modeling (ROI and forward-looking, activity-based cost, P&L, and marginal opportunity analysis - all considering process constraints); a holistic view (products, customers, resources, supply chain processes, partners, etc.); and extensive optimization and business rules capabilities (objective function, rules, constraints, etc.). Plus collaboration, integration, and monitoring.

While Enterprise Optimizer is a horizontal technology, River Logic is focused on delivering EO-based solutions in a couple of areas, especially Consumer Packaged Goods, with Healthcare as a secondary market. For example, CPG solutions include strategy modeling (product portfolio, capital planning/network design), policy (inventory policy/product segmentation, sourcing, planning frequency), S&OP (executive, master planning, production planning, etc.), customer profitability, and cost to serve.

The product itself has a simple diagram-style interface used to create the business processes that drive value in an organization. These diagrams model the supply chain and show how things like trade promotions impact volume, distribution and financial performance. Tactical planning solutions are subject to policy, financial, and regulatory constraints, from working capital to carbon emissions. The models also report forward-looking costs (akin to ABC costs but projected forward considering the constraints of the business), P&L, balance sheet roll-up, cash flow, etc. Enterprise Optimizer models processes and more, but it doesn't execute them - the model is just built and the engine figures out what the constraints and cost drivers are.

The basic approach can be illustrated by considering a simple Purchase-Inventory-Conversion-Inventory-Sales process. The PICIS model is very common in manufacturing organizations: they buy raw materials (Purchase), creating Inventory, which is then manufactured (Conversion) into finished goods (Inventory) that must be sold (Sales). EO lets you easily create a process with a basic set of nodes, one for each step. EO translates this model into a set of mathematical representations and runs analyses against these nodes. Each node has a different representation, and the user can specify different kinds of information for each node type. When the model is executed, additional information is created on each node - the engine calculates things like opportunity value (e.g., the marginal profit from one more item or an additional customer) or optimal production schedules. Lots of information is defaulted, based on extensive research, so the model can be run quickly once basic information is filled in - users, of course, find it easier to edit a model once they can see what it does. As the user adds more information, the model becomes more constrained and more accurate, and the tool is designed to support a highly iterative style of working.
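To make the PICIS idea concrete: with a single product and linear costs, the serial chain reduces to a bottleneck calculation - throughput is capped by the tightest constraint. The sketch below is my own illustration with made-up numbers; EO's generated mathematical formulation is proprietary and far richer than this:

```python
def picis_profit(supply, capacity, demand, unit_margin):
    """Profit of a serial Purchase-Inventory-Conversion-Inventory-Sales
    chain with one product: throughput = the tightest constraint."""
    throughput = min(supply, capacity, demand)
    return throughput * unit_margin

# Hypothetical figures: 500 units of raw material available, conversion
# capacity of 400 units, demand of 450, margin of $4 per unit.
profit = picis_profit(supply=500, capacity=400, demand=450, unit_margin=4.0)
print(profit)  # 1600.0 - conversion capacity is the binding constraint
```

Real EO models, of course, involve many products, periods and non-linear constraints, so the engine solves an optimization problem rather than a single `min`.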

The basic nodes support different elements of the business:

  • Purchase nodes allow the price and constraints (min or max units available per period, etc.) to be specified for a user-defined list of raw materials. Once the model has been executed, the node displays things like opportunity value (the profit from getting one more unit of an item).
  • Conversion nodes can specify different machines or resources that convert raw materials into finished goods. For each resource the user can specify characteristics such as labor rate, fixed costs, period, work units per period, etc. Conversion nodes can also contain the processes that run on the resources. Process costs, rates, setup costs, etc. can all be specified. Various forms of cost analysis, activity-based costing, and throughput accounting are supported.
  • Inventory nodes can specify the various products or materials and their price, etc. A flow from a conversion node to an inventory node allows you to map materials to the processes that produce them. The flow from raw materials inventory to resources allows you to specify the BOM or recipe for the various products.
  • Sales nodes let you specify various constraints on sales, model price elasticity with non-linear constraints, etc.
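The node types above could be sketched as simple record types connected by flows. The field names here are my assumptions for illustration only - EO's actual node schema is not public:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PurchaseNode:           # raw material supply
    material: str
    unit_price: float
    max_per_period: Optional[float] = None  # supply constraint

@dataclass
class ConversionNode:         # machine or resource
    resource: str
    labor_rate: float
    fixed_cost: float
    units_per_period: float   # capacity constraint

@dataclass
class InventoryNode:          # product or material held in stock
    product: str
    unit_price: float

@dataclass
class SalesNode:              # demand side
    product: str
    max_demand: float

@dataclass
class Flow:                   # edge between nodes, e.g. BOM quantity
    source: object
    target: object
    rate: float = 1.0         # units of source per unit of target

flour = PurchaseNode("flour", unit_price=0.5, max_per_period=500)
stock = InventoryNode("flour", unit_price=0.5)
bom = Flow(flour, stock)
```

A flow from raw-material inventory into a conversion resource would carry the recipe rate, matching the BOM mapping described in the bullets above.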

There's more, with each node supporting a potentially very large amount of information about the step, how it operates, and its financial implications. A fifth node type, financial report, can be added and mapped to a series of financial reports. The financial models can be specified in detail, but there is a lot of useful defaulting built in based on research with PricewaterhouseCoopers.

Once a minimum amount of information is specified behind the nodes, the engine can be used to create a model of the business based on the specification. EO will optimize for profit on any unconstrained variable. Options for detailed unit cost analysis and other kinds of analysis exist and can be added to the model run. Running the model updates it with implied attributes and optimized values, which the user can then revise as necessary. EO also allows the models and constraints to be extended so that companies can model non-financial, non-process constraints and measures - the number of truck trips through residential neighborhoods per day, special company measures, etc.
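The opportunity values EO reports behave like shadow prices in linear programming. One simple way to approximate such a value - shown here for the single-product chain, with all names and numbers my own illustration rather than EO's method - is to re-solve the model with one constraint relaxed by a unit and measure the profit change:

```python
def profit(supply, capacity, demand, margin=4.0):
    # Single-product serial chain: throughput = tightest constraint.
    return min(supply, capacity, demand) * margin

def opportunity_value(params, name, delta=1.0):
    """Marginal profit from one more unit of a constrained resource,
    estimated by re-solving with that limit relaxed by `delta`."""
    bumped = dict(params, **{name: params[name] + delta})
    return (profit(**bumped) - profit(**params)) / delta

p = {"supply": 500.0, "capacity": 400.0, "demand": 450.0}
print(opportunity_value(p, "capacity"))  # 4.0: capacity binds, so one more unit earns a full margin
print(opportunity_value(p, "supply"))    # 0.0: supply is slack, extra units are worthless
```

This is why, as described above, a Purchase node can show zero opportunity value for a material that is not the bottleneck while a Conversion resource shows a positive one.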

River Logic is also building an "IBP ecosystem" to make it easier for companies in the CPG and Healthcare spaces (initially) to deploy IBP solutions.

Excel, of course, is the major alternative, and EO typically augments it rather than replacing it. Most EO users are doing what-if analysis and scenario planning - not the real-time, workflow-oriented, day-to-day optimization done by users of CPLEX, Dash, or Dynadec. The integrated financials and built-in accounting best practices are, to my mind, the key differentiator, though EO also includes a web-based scenario management tool that lets users name, store, retrieve, and compare entire models side by side.

Posted July 16, 2009 8:53 AM