Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is the leading expert in decision management and a passionate advocate of decisioning technologies such as business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

September 2008 Archives

Copyright © 2008 James Taylor. Visit the original article at New Community for Fair Isaac customers and others.

The folks at Fair Isaac pointed me to a new community they have just released - dmtools.fairisaac.com. I have not had much of a chance to check it out but it looks useful and I look forward to participating. One thing is new - you can download trial versions of Blaze Advisor. Have fun…



Posted September 30, 2008 3:15 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Repositories, processes, decisions and more.

Bruce Silver had an interesting article recently on The Next Innovation in BPMS in which he discusses the need for repository capabilities in BPM. Bruce makes the point that “next generation” repositories for process management must not only support process models, they must also support “decision models”, business object definitions, performance measurement information and service metadata for deployed services.

Today most of these things are in separate repositories and, frankly, I am not sure this is going to change. For one thing, the challenge of a single repository handling everything is enormous, and past experience with “the big repository in the sky” is that it cannot be made to work. Specialization is just too valuable for some of the pieces and people involved. As such, one ends up with a federated arrangement: multiple specialized repositories, each able to query and link to the others.

For this to work I think there are a few key features (sketched in code after the list):

  • Strong query APIs for all so that one repository can use dynamic queries to access others
  • Pre-defined impact analysis so that changes can be analyzed for their impact across all the repositories
  • Extensibility so that a repository can be extended to have hooks to another for explicitly linking objects in one to objects in another
  • Standards support
  • User interface elements suitable for mashup/portal use so that a user can build the view they need
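As a rough illustration of what such a federation might look like - a purely hypothetical Python sketch with invented class and method names, not any particular product's API - each specialized repository sits behind a common query interface while explicit cross-links drive impact analysis:

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class RepoItem:
        repo: str        # owning repository, e.g. "process" or "decision"
        item_id: str
        name: str

    @dataclass
    class FederatedCatalog:
        """Thin layer over specialized repositories: fan out queries, track cross-links."""
        repositories: dict = field(default_factory=dict)   # repository name -> query function
        links: list = field(default_factory=list)          # explicit (from_item, to_item) pairs

        def register(self, name, query_fn):
            self.repositories[name] = query_fn

        def search(self, term):
            # Dynamic query fanned out to every registered repository
            return [item for query in self.repositories.values() for item in query(term)]

        def link(self, a, b):
            self.links.append((a, b))

        def impact_of(self, changed):
            # Pre-defined impact analysis: everything linked to the changed item, in any repository
            return [b for a, b in self.links if a == changed] + \
                   [a for a, b in self.links if b == changed]

    # catalog = FederatedCatalog()
    # catalog.register("process", process_repo.search)    # hypothetical repository adapters
    # catalog.register("decision", decision_repo.search)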



Posted September 30, 2008 3:35 AM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Franchises, localization and decision management.

I live in Palo Alto and a new Mountain Mike’s Pizza has just opened up near us. Much as we like MM pizza we have two problems - we like wholewheat dough and, as several members of my family are lactose/milk intolerant, soy cheese. If you have visited or live in Palo Alto you will be thinking to yourself “typical Palo Alto residents” - this is very much a soy-cheese-and-wholewheat-dough-pizza kind of place. However we cannot get either at our local MM because the parent company does not offer them.

It is common for folks to criticize global brands and franchising organizations for having this one size fits all approach. Of course the most successful proponent of franchising ever, McDonalds, does tailor its menu to suit local tastes. But what, I hear you ask, does this have to do with decision management?

Well my experience with MM is like most customers’ experience with the companies with whom they do business - one size fits all. Whatever the company thinks is good is what customers can have. The preferences or desires of each customer, or even of a customer segment, are of little or no importance. Learning not from MM but from McDonalds would push in the opposite direction - companies would think about how they could tailor their products, pricing, offers or marketing to suit, and suit individuals or micro-segments, not just regions. Decision management - especially the management of these micro decisions - is key. If I regard the decision “what products should I offer this customer” as a customer-by-customer decision (a micro decision) then I need to manage it much more precisely and at a much more fine-grained level than if I treat it as a “once and done” decision to be applied to everyone.
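As a tiny, invented illustration (the attributes and rules are mine, not Mountain Mike’s or McDonalds’), the difference shows up in code: a “once and done” decision is a constant baked into the process, while a managed micro decision is a function of the individual customer that can be changed without touching the process itself.

    # One size fits all: every customer gets the same offer
    STANDARD_OFFER = ["regular dough", "mozzarella"]

    # Micro decision: decided customer by customer, with rules the business can manage
    def offer_for(customer):
        offer = list(STANDARD_OFFER)
        if customer.get("prefers_wholewheat"):
            offer[0] = "wholewheat dough"
        if customer.get("lactose_intolerant"):
            offer[1] = "soy cheese"
        return offer

    print(offer_for({"prefers_wholewheat": True, "lactose_intolerant": True}))
    # ['wholewheat dough', 'soy cheese']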

Mountain Mike’s is living in the mass production industrialization of the past. McDonalds is living in the localized industrialization of the present. Decision management is what it will take for most companies to move to the post-industrial mass customization of the future.



Posted September 25, 2008 3:33 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Finding hidden decisions in business processes.

Scott Sehlhorst (with whom I have presented and about whom I have written before) had a great post this week called Hidden Business Rule Example. Scott walks through some analysis of a process and shows how finding hidden decisions within that process can really inform how you think about the systems and processes you need. This is similar to a service that we offer and that Neil and I call the Decision Discovery Process. I liked Scott’s example a lot as the decision to do something 100% (or 0%) of the time is a classic hidden decision - people don’t see it as they always/never do something.

I agree with Scott that analysis of the customers who abandon the process would be a great way to find out where to prioritize your effort. I would add that if there are multiple ways to make the decision, and it is not clear which one will work best, then it might be worth using adaptive control to test multiple approaches and track which one is most successful in generating revenue.
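A minimal sketch of that kind of adaptive control - champion/challenger testing with made-up strategy names, where most traffic still goes to the current approach while revenue per strategy is tracked:

    import random
    from collections import defaultdict

    strategies = ["champion", "challenger_a", "challenger_b"]
    weights = [0.8, 0.1, 0.1]
    revenue = defaultdict(float)
    customers = defaultdict(int)

    def choose_strategy():
        # Randomly assign each customer to one way of making the decision
        strategy = random.choices(strategies, weights=weights)[0]
        customers[strategy] += 1
        return strategy

    def record_outcome(strategy, amount):
        revenue[strategy] += amount

    def best_strategy():
        # After enough volume, compare revenue per customer by strategy
        return max(strategies,
                   key=lambda s: revenue[s] / customers[s] if customers[s] else 0.0)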

A great post from Scott and well worth the read.



Posted September 24, 2008 4:47 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Collections Best Practices.

Jeff Bernstein of Strategem Portfolio Services gave an overview of the latest developments in collections. Jeff’s company has a product called Strategy Director (about which I blogged before). Jeff does a lot of work with collections groups and all too often sees a failure to implement analytics even where analytic models are being developed for collections. There is thus a need not just for a technology platform for analytics but also for education, so that models are actually used once they are done.

Collections has been reshaped by a combination of increased expectations and a globally competitive market in the last decade. Collections has been evolving in multiple dimensions:

  • Skills metrics are essential, not just portfolio results that collectors may or may not control. Unless people feel that skills and talent, rather than results driven by portfolio quality, are what get them rewarded, it is hard to change behavior. Leadership, management and skill building are increasingly important.
  • Cost efficiency was the drive for a lot of outsourcing and offshoring as well as the more recent push for alternative contact channels. Now a more holistic approach to tracking results and costs is more common.
  • A risk based focus is now more or less the basis for collections work. Pooling by degree of risk and segmentation by risk are the norm. New systems allow this to drive very consistent outcomes as rules-driven processes, voice response systems and more are increasingly common. Operational negation - having the business do something that contradicts the analytically developed approach - is on the decline, having been epidemic in the past.
  • Quality management is now the norm with best in class organizations using recordings etc to review performance and target improvements.

Progressive organizations are taking a number of steps:

  • Management Systems Approach
    Top to bottom performance visibility with skills-based metrics being widely used and often even visible to everyone. Leadership focus on individual development and quality monitoring is continuous.
  • Offshoring/Outsourcing
    Portfolio risk strategies now drive outsource strategies - which segments of the portfolio show the best cost/benefit for outsourcing? Rather than using outsourcing as an overflow it is now a more integrated component with a particular portfolio focus. Increasingly a part of the company not a separate one and increasingly informed by analytically-derived strategies.
  • Alternative Channels
    Channels are now integrated not bolted-on. Analytics, like impact modeling, again help find the folks who are likely to be handled well by a specific channel. Interactive voice response systems can be included in very sophisticated approaches thanks to their support for rules and scripting as can SMS and web channels. IVR, SMS or web might be exclusive for some customers and one of many for others driven by preferences and analytic measures of effectiveness. Optimizing contacts across these multiple channels, and across offshore or outsourced groups, uses this array of choices to maximize results.
  • Real-time data
    Progressive organizations are using recent and even real-time data to drive better conversations and to avoid repeating failed strategies. Using trigger events to requeue and reprioritize/allocate accounts is critical to improving results.
  • Quick response to evolving risk
    Using internal and external scores and tracking of non-delinquent accounts to find early signs of problems and building analytics that can help find and treat these accounts before they go wrong is very productive. Combined with more real-time data and a more holistic view of potential treatment channels this can be particularly effective.

The end result is true Lifecycle Risk Management in Collections: micro-segmentation of early stage accounts and event triggers for rapid response; rules-based processing that uses all the available channels; and well designed, skills-based performance metrics. Risk-driven, skills-based routing to the right channel and the right approach, dynamically adjusted.
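As a small, hypothetical sketch of what risk-driven, skills-based routing can look like (the segments, channels and thresholds below are invented, not taken from Jeff’s material):

    def route_account(account):
        """Pick a treatment channel for a delinquent account from its risk and value."""
        risk = account["risk_score"]          # higher = more likely to roll to loss
        balance = account["balance"]

        if risk > 0.8 and balance > 5000:
            return "senior_collector"         # high risk, high balance: strongest skills
        if risk > 0.8:
            return "outsourced_agency"        # high risk, low balance segment
        if risk < 0.4 and account.get("prefers_sms"):
            return "sms_reminder"             # likely self-cure, low-cost channel
        return "ivr_campaign"

    print(route_account({"risk_score": 0.9, "balance": 8000}))   # senior_collector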



Posted September 23, 2008 8:53 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at From Scores to Strategies.

The use of analytics in business decisions, presented by one of InfoCentricity’s customers, was next. In many organizations modelers are busy building predictive models that they then throw over the wall to a business analyst. To bridge this gap you need a collaboration platform that allows modelers to do their thing while allowing business analysts to do theirs. Xeno supports this kind of collaboration. In general this collaboration makes a difference in a number of areas:

  • The population of interest
    Scorecards tend to exclude those with little history, pre-defined treatments or segments where history is unreliable. On the business side you tend to have hard policy rules that affect treatments. Collaboration on the “big picture” of the population can affect scorecard design, explain data anomalies and allow both sides to re-examine the population with respect to business goals.
  • Performance outcomes
    Understanding how long a model / strategy will be in production and what it is going to be used for helps build the right scorecard. If a scorecard is going to be used for a long time while the strategy changes often then a more generic scorecard may be more useful, for instance.
  • Predictors or decision keys
    How do you decide if something is a segment variable or a predictor? Modelers think about statistical measures whereas business analysts think about policies, customer service impact and business performance. It is worth some back and forth between policy, data sources and scorecard - “hard” rules first, then the softer policies, then refinement based on analytics, for instance. Knowing how sophisticated the strategy is likely to be will also prevent over-development of scorecards.

The customer came next and presented on the specifics of how to review policies analytically to streamline the loan review process in originations. Policies come from bad loan review (a what-went-wrong review that generates rules for next time), history and domain expertise. Modelers, meanwhile, can use standard application data, credit variables and policy variables. They also needed to infer the performance of rejected and unused accounts (reject inference) so that this could be used in the model. They mimicked the application flow as a tree and then got modelers and business people to work very interactively to see what existing policies did, try different scenarios and test other variables to see what might be worth including in policies. This helped them find:

  • Unproductive policies
    Why review those with low acceptance rates/high bad rates or high acceptance rates/low bad rates?
  • Policies that should be added
    Medium bad rates might respond to further splitting to divide high from low bad rates
  • Places where more precision would be helpful
    Existing splits that have similar bad rates, for instance, could be replaced with splits that work better

In general this approach led to many small changes that had a sizable impact while also increasing confidence in the automated decision making. Origination takes more than scores; it takes policy rules too. Reviewing these rules analytically makes for better efficiency and more validated changes.
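A minimal sketch of that kind of analytic policy review (the column names, data and thresholds are invented): for each policy rule, compute the acceptance rate and the bad rate among the applications it flags, then look for rules whose reviews rarely change the outcome.

    import pandas as pd

    # One row per application: a flag per policy rule plus the observed outcomes
    loans = pd.DataFrame({
        "flagged_by_policy_a": [1, 1, 0, 0, 1, 0],
        "accepted":            [0, 1, 1, 1, 0, 1],
        "bad":                 [0, 0, 0, 1, 0, 0],   # known only for accepted loans
    })

    def review_policy(df, flag):
        flagged = df[df[flag] == 1]
        accepted = flagged[flagged["accepted"] == 1]
        return {
            "acceptance_rate": len(accepted) / max(len(flagged), 1),
            "bad_rate": accepted["bad"].mean() if len(accepted) else float("nan"),
        }

    # A rule whose flagged applications almost all end up with the same outcome
    # is adding little: the review rarely changes anything and is a candidate to drop.
    print(review_policy(loans, "flagged_by_policy_a"))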



Posted September 23, 2008 7:07 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Putting Analytics to Work.

Here’s my presentation from the InfoCentricity User Exchange. Enjoy.



Posted September 22, 2008 11:15 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Scorecard Development Efficiencies with Xeno.

Sue Gonella presented on some efficiencies in building predictive scorecards. In particular she covered the use of sampled data versus all records in a model development exercise.

Rather than using all records she advocated using stratified random sampling, where a sample of each group of interest is used to build and validate the models. This works better because turnaround times are shorter and experimentation is easier. She demonstrated that predictive power is comparable if you use 10,000 records or so per performance group, so there is no loss of accuracy if this is done right.

She walked through an example of this showing that for the same model performance she could save more than 99% of the time involved. This enabled a lot more experimentation as most changes to the model made little or no difference to the time taken when 10,000 sample records are being manipulated (whereas the same changes would have caused the full dataset to run even slower). Similarly demoting predictors that make zero contribution so that they don’t affect subsequent iterations makes for even better performance with little or no impact on predictive power.

Clearly taking these steps - stratified random sampling and the elimination of zero-contribution predictors - makes for MUCH faster iteration in model development and thus better models. She also pointed out that, even if you are required to use all records in the final model, you can do a lot of the development work with the sample data to improve performance and so allow many more iterations.
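A minimal sketch of the sampling step in pandas, assuming a performance-group column in the data (the 10,000-per-group figure is the one from the talk; everything else is illustrative):

    import pandas as pd

    def stratified_sample(df, group_col, n_per_group=10_000, seed=42):
        """Take up to n_per_group random records from each performance group."""
        return (df.groupby(group_col, group_keys=False)
                  .apply(lambda g: g.sample(n=min(len(g), n_per_group), random_state=seed)))

    # dev = stratified_sample(all_records, "performance_group")
    # Iterate on the scorecard against `dev`, then validate (or refit) on the full data.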



Posted September 22, 2008 10:00 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Marketing and Customer Segmentation with Xeno.

Delivering the best value proposition using segmentation is a journey with six main steps and some critical differences from other analytic approaches:

  1. Define Segmentation Objectives
    The first step - deciding why to build a segmentation scheme - is important but often overlooked. Reasons may include declining financial performance, changes in strategy or market trends - the current segmentation just does not meet the company’s goals.
  2. Conceptualize Segments
    Not using analytics yet - just using the overarching business objectives to identify the most useful segmentation approach. Time to engage domain experts to discuss and come up with a few candidate approaches that might work, such as demographic, geographic or customer value-based segmentation.
  3. Gather Data
    Data comes from lots of sources and once you pull it together and analyze it you must expect a feedback loop that changes the data you have or want. Constant improvement is the order of the day. You can’t build segmentation with data you don’t have. You also need to be sure the data is going to continue to be available and that it will be available in production. Data comes from all channels and might include transactional, behavior, demographic and attitudinal data. Consider complaints, customer service calls and more for attitudinal data. Multi-channel data matters as customers engaged in multiple channels tend to be much more loyal, for instance.
  4. Apply Analytics to Develop Segments
    Decision trees, regression and clustering are always the most popular techniques when data miners are polled. This is a very iterative process for clustering, just as it is for building trees or scorecards. In many ways clustering takes the same steps for loading data, exploring it and transforming variables for use. The challenge is reducing your candidate variables to a list that will drive good clustering and this is where the skill of a modeler and the engagement of business users are critical. Developing the segments, profiling them and getting the segmentation code at the end is also pretty familiar to most analysts. Xeno supports the whole process with a number of specific features like variable selection metrics, outlier handling, discrete and continuous variables and more.
  5. Develop Marketing Applications for Segments
    Name segments and profile them with characteristics. Size them, prioritize them and create a value proposition for each. Prioritization involves considering the size of the segment, the value of the segment (size, revenue, growth, profitability) and deployment potential (how compelling a value proposition, suitable deployment resources). Figure out what changes you need to your operational processes, resources etc. You should also consider building personas (from Alan Cooper’s work) for these segments (I blogged about using analytics and personas together before). Examples of companies using personas include Dell, FedEx and Best Buy.
  6. Build Deployment Strategies
    The last step is to identify key marketing levers for the segments and create unique value propositions/programs for each. Roll them out and track, track, track.

Flora also walked through some case studies. First was a services company offering residential services across multiple brands. Lots of data but challenged to cross-sell and up-sell. Using clustering to find customer segments and developing personas to give color to these segments helps clarify the motivators and potential home service purchases for each. Segments were based on income, home ownership and length of time in residence. Segments included “snug as a bug” families and “old timer” retired couples. A second case study was focused on collections and had segments like “struggling”, “don’t bother me” and “credit users” depending on internal balance, internal risk and external risk.
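A small sketch of the analytic step using k-means clustering on the kind of variables mentioned in the first case study (the data values and the choice of three clusters are invented):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Candidate variables: income, home ownership (0/1), years in residence
    X = np.array([
        [45_000, 0, 1],
        [120_000, 1, 12],
        [80_000, 1, 4],
        [30_000, 0, 2],
        [95_000, 1, 20],
    ], dtype=float)

    # Scale the variables so no single one dominates the distance calculation
    X_scaled = StandardScaler().fit_transform(X)

    segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
    print(segments)   # a segment id per customer, ready for profiling and naming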



Posted September 22, 2008 6:40 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Impact Modeling and Maximizing Marketing Return.

Nina Shikaloff discussed an analytics technique that I had not heard of - Impact Modeling. Impact Modeling is a decision modeling technique. Decisions about acquiring customers (what to offer, for instance), managing customers and handling difficult customers are all important, and it can be tricky to identify better ones. Impact modeling is about explicitly measuring the incremental financial impact of a strategy - how much more will I make if I do ‘B’ rather than ‘A’ - and then mapping segments of customers to the optimal decision.

However you can’t test any particular customer with both strategies to compare results so instead you use adaptive control (A/B testing) to try A on 90% (say) and B on the other 10% while tracking the results. The Impact Modeling algorithm then searches through the results to see which segments respond better to which strategies. Essentially it uses the results to find segments where one particular strategy works better and keeps driving down into the details of these segments to find more and more fine-grained ones where one approach or the other works better. The outcome is a decision tree or a simple ruleset that picks one of the strategies for each segment - very deployable. It is also easy to simulate the impact of the approach allowing you to maximize the financial impact.
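The core calculation is easy to sketch (a toy example with made-up numbers, not InfoCentricity’s algorithm): for each candidate segment, compare the average financial outcome under the two strategies and keep only the splits where the incremental impact is material.

    import pandas as pd

    results = pd.DataFrame({
        "segment":  ["young", "young", "young", "older", "older", "older"],
        "strategy": ["A", "B", "A", "A", "B", "B"],
        "revenue":  [100.0, 140.0, 95.0, 120.0, 110.0, 115.0],
    })

    # Incremental impact of strategy B over strategy A, by segment
    by_segment = results.pivot_table(index="segment", columns="strategy",
                                     values="revenue", aggfunc="mean")
    by_segment["uplift"] = by_segment["B"] - by_segment["A"]

    # The deployable output is then a simple rule per segment
    rules = {seg: ("B" if row["uplift"] > 0 else "A") for seg, row in by_segment.iterrows()}
    print(rules)   # {'older': 'A', 'young': 'B'}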

Impact Modeling can be used when tracking multiple financial objectives and can be constrained by competing objectives (risk v revenue, for instance). It can also be extended to more than 2 choices and can be used on relatively small samples.

Nina illustrated the power of Impact Modeling with a couple of case studies. The first was a credit card issuer trying to find the right APR increase that would boost revenue without increasing risk or attrition. They found that, to maximize results, half the accounts should get an APR increase (some small, some larger) while the other half should not. Each strategy was applied to multiple segments and one of the interesting effects of Impact Modeling is this understanding of the segments. The second was another credit card issuer with a very diverse target group and learning which sub-segments responded to the two offers was very informative. Not only did Impact Modeling get better results, the user learned a lot too.

Given that the outcome is a decision tree it may seem like Impact Modeling is the same as normal decision tree modeling, but it is really an analytic technique for finding the right rules: it analytically finds the right tree nodes, considers the impact of prior decisions and allows multiple objectives to be considered. Personally I really like these kinds of analytic techniques as they are so clean to deploy, allowing them to be put into production rapidly. This is something that I will touch on when I speak this afternoon.



Posted September 22, 2008 4:50 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Live from the InfoCentricity User Exchange.

Today and tomorrow I am attending the 4th annual InfoCentricity User Exchange. I got an overview of their Xeno product some time ago (blogged here) and I am looking forward to learning more about what their customers do with the product. All the attendees have our book too so that should be fun.

First up is Chris Frothinger, CEO with some opening remarks about the importance of the user exchange to InfoCentricity. He began by saying that InfoCentricity is doing well again this year, despite the tough economic conditions. Chris had a great phrase for the times - “data-driven not deal-driven”. Companies making data-driven decisions will do better in these difficult times and those are the kind of companies who are InfoCentricity customers. Customers attending come from auto finance, retail, marketing, cards and more.



Posted September 22, 2008 4:19 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Segmentation and product design.

Scott had a great article on segmentation and personas this week that is a nice, quick read. I think the use of analytics in persona design can make a big difference (as I have noted before) and that decision management can use good customer segmentation as a first step towards extreme personalization.

If you are not already segmenting and analyzing your customers, you should be.



Posted September 18, 2008 3:38 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Making decisions about loyalty programs.

1:1 had a nice piece on the growing role of loyalty programs in retail. This noted the “Growing sophistication in loyalty programs” among retailers and, in particular, the use of loyalty program data not just to calculate lifetime customer value but also to build competitive advantage. This second aspect is the one I always find compelling. If you can use data about the behavior of your customers to see how loyal, profitable, expensive, new, longstanding or other kinds of customers behave then you can build better models and make way better decisions. As a longstanding promoter of intense personalization and consistency across channels I was particularly pleased to see that more than half the respondents were using customer loyalty data for “elements that suit specific customer affinity and preference” (53 percent) and “personalized promotions across channels” (52 percent).

Besides recommending strong customer ownership, the study suggested two particular areas where retailers should focus. The first is on customer reactivation, the second on multichannel loyalty campaigns. Mapping these to decision management, we get the following advice:

  • Make loyalty offer decisions explicit so they can be consistent across channels
  • Make loyalty offer decisions explicit so they can consider the value of a customer, the likelihood of attrition and more before spending money on loyalty offers
  • Make channel choice decisions explicit when prioritizing a channel for communicating with customers e.g. in a loyalty campaign
  • Make the decision to try and reactivate a customer explicit so that effort is spent reactivating the kinds of customers you need now - don’t try and reactivate Christmas-only shoppers in Spring, wait until Fall
  • Make the decision on what to offer someone to reactivate based on what has worked for people like them in the past, the cost of offers, the expected value of reactivation and more.
  • Make sure the reactivation program uses up to date information so you don’t make offers to people who just reactivated themselves

and so on. Two previous posts seem particularly relevant. This one on using decision management to build loyalty and grow and this one on using EDM to keep loyalty where you want it - with the company not an individual employee.
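To make the first couple of these concrete, here is a tiny, hypothetical sketch of an explicit reactivation decision (all attributes, segments and thresholds are invented):

    from datetime import date
    from typing import Optional

    def reactivation_offer(customer, today: date) -> Optional[str]:
        """Decide whether, and with what, to try to reactivate a lapsed customer."""
        if customer["days_since_last_purchase"] < 30:
            return None                        # they just reactivated themselves
        if customer["segment"] == "christmas_only" and today.month < 9:
            return None                        # wait until Fall for seasonal shoppers
        if customer["lifetime_value"] > 1000:
            return "personal_call_plus_voucher"
        return "email_10_percent_off"

    print(reactivation_offer(
        {"days_since_last_purchase": 200, "segment": "regular", "lifetime_value": 1500},
        date(2008, 9, 16)))    # personal_call_plus_voucher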



Posted September 16, 2008 11:39 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at More on standards - Rule Interchange Format.

Continuing on the theme of standards, several working draft specifications have recently been published for public comment by the Rule Interchange Format (RIF) working group of the W3C:

  • The specification of RIF basic logic dialect (RIF-BLD) is in its “Last Call” public comment period. This is the time for people to read it and tell us, the working group, about anything that doesn’t seem right. After this, if you don’t like something in the spec, it will be increasingly hard to get it changed;
  • The RIF-RDF-OWL specification [3], also in its Last Call public comment period, explains how RIF-BLD interacts with RDF and OWL;

The other working drafts that were published for public comment are:

Comments are requested by September 19 in order to consider them for the next set of revisions.



Posted September 16, 2008 3:06 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Chief Decision Officer?.

Mitch Betts’ blog brought an interesting article to my attention this week - an interview with Accenture chief scientist Kishore Swaminathan in which he argues that CIOs need to move up the value chain and become Chief Intelligence Officers. I kinda like this but I would equate being a Chief Intelligence Officer not with data but with decisions. A CIO should be working to ensure that all the systems in the organization improve decision making. Some will do this by providing the right information and analysis to people to make decisions, some by making better decisions themselves. To get value from its data an organization must make better decisions with it, and that should be the role of the CIO.



Posted September 16, 2008 3:51 AM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at More support for PMML.

Nice to see support for PMML (Predictive Model Markup Language) continuing to expand with the recent announcement of support from Pentaho. Support for standards like this is important in decision management as a number of products will typically need to be used in combination to build decision management solutions.



Posted September 16, 2008 3:41 AM
Permalink | No Comments |
Copyright © 2008 Neil Raden. Visit the original article at Wither Analytics: An Homage to Hy Minsky.

When it comes to analytics, Wall Street is clearly the leader. The best of the best head there after school to six-figure starting salaries and some even see seven figures based on their performance. They are the rarae aves, the crème-de-la-crème, and whenever we speak about “Competing on Analytics,” it goes without saying that Wall Street analytics represent the exemplar of what is possible for an analytic culture.

So why is Wall Street melting down?

Clearly, analytics aren’t everything. Our financial system is pretty complicated and subject to abuse and fraud. The current crisis is aligned with the greed of the mortgage brokers and the mortgage bankers, and once in a while the financial press will point the finger at the hedge funds and investment brokers that shoved mortgage-backed securities down the throats of other investors.

Hmmm.

Wasn’t anyone watching this? After all, interest rates started to creep up a few years ago, the economy started to turn down, and rising default rates started to appear at around the same time. Is it possible that the quants were so buried under leveraged layers of derivatives and other exotic instruments that they didn’t see the coming storm? This seems like a pretty big movement to miss. After all, if you’re sitting on top of a few billion in debt that is on the razor’s edge of liquidity, wouldn’t you spend some time looking at it more closely, especially with such broad macroeconomic factors staring you in the face?

Maybe the problem was just that – too much attention to the monetary and business-related factors and not enough attention to the movement of markets on a broader scale.

In the early ’70s, I had the unique opportunity to take economics classes from the legendary (but until recently, obscure) Hy Minsky. Minsky is known for the “financial instability hypothesis,” which proposes that economic expansions become unsustainable booms ending in crisis and economic unraveling. Speculation. Greed. Disaster. I first heard the phrase “Chaos Theory” from Minsky thirty-five years ago.

Minsky has suddenly become very popular (unfortunately he passed away in 1996). One of his memorable quotes in class (there were many) was: “All panics, manias and crises of a financial nature, have their roots in an abuse of credit.” He used the Dutch Tulip mania of the 1600’s as an example. He believed that financial systems experience rounds of speculation that, if they are severe, end in crises. Minsky was considered a radical for his stress on their tendency toward excess and upheaval.
He showed that bubbles are an inevitable result of market activity. Buyers who show gains with a successful strategy encourage other buyers until it stops working. When investors have to empty their portfolios of even their prime holdings to cover their positions, markets start to circle the drain.
At that point, the “Minsky moment” is obvious.
And it’s here – Bear Stearns, Countrywide, Lehman Brothers, Merrill Lynch.
So the question is, why didn’t the best and the brightest see it coming? We need to do some soul-searching. Is there really any benefit to advanced analytics and an analytics culture if it doesn’t see the train coming through the tunnel? Or is it something else? I think that no one wants to believe a “Minsky Moment” is coming.



Posted September 16, 2008 12:08 AM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at First Look - Erudine Behaviour Engine.

Erudine is a British company a few years old and has released some new technology in a new process context - the Erudine Behaviour Engine (yes, the British spelling). Like many technologies, Erudine is targeting the business-IT divide, focusing on problems like those of translating requirements into systems, integrating the expertise of lots of people (analysts, designers, developers) and communication. Besides the problems these things cause in building a first version, constant change tends to cause functionality to drop behind requirements steadily over time. This is exacerbated by problems of knowledge retention - through the lifetime of a commercial system knowledge is lost (by retirement or resignation but also by the passage of time) and so must be rebuilt for each new release at an additional cost. At the heart of this problem is the basic fact that there is lots of knowledge that must be extracted and turned into the new system - legacy code, expertise, policies, regulations etc. Their perspective also is that while writing code is quick, checking it and confirming it is complete and correct is much harder and slower, especially when one has to consider the implications and consequences of a change.

Erudine focuses on tacit knowledge (rather than explicit knowledge) and develops the behavior model of an application by looking at real cases and asking those specifying the system to say what the system should do in each case and then justify it. This is test-driven design on steroids - developing business behavior starting from the answer we want and moving to why that is the answer, one functional point at a time. This is a very different approach from the more explicit-knowledge approach taken in business rules management systems. Some critical facts about Erudine:

  • Each functional point can be an example case, a test case or a real transaction.
  • Development rapidly focuses on the exceptions and exceptions to exceptions - and the consequences of those exceptions
  • Conceptual graphs are used to present data in a visual way
  • This data is described using an ontology

A demonstration of a customs border example showed how some of this worked. The system is designed to help a customs officer decide what action to take in response to a particular person trying to cross a border. In many ways this whole example is a decision service. The first design step would be to lay out the data flow - specifying how to get data from data sources, carry out data cleansing, enrichment and integration activities etc. This decision flow, if you will, also handles sequencing of steps and the specification of behavior steps or decisions. The decision node in the example is designed to choose one of four actions - arrest, deny, accept, detain.

Before the “rules” can be specified, an ontology must exist. This can be loaded from OWL if you already have it defined and can be completed during development - you could start with a basic one and then refine it as you worked through cases, for instance. Data can be mapped directly from databases (although more complex entities must be mapped in using Java Hibernate classes). So far this seems like development work but we have not got to the clever bit yet.

Once you reach this point, the decision node can be specified by non-technical users. Business experts can take a list of situations (prior instances from a legacy application, test cases, formal examples or whatever) and then view each one using the conceptual graph. For each instance the user specifies the decision they would take - what their conclusion would be for this instance - and then explains why. This is done in a point-and-click way using the conceptual graph. For instance they might take a record representing a person in this example and say they are allowed in because they have a visa. This creates what they call a cornerstone, or unit test. Both the structure of the data and its values can be used in these rules. This is a powerful approach because it is often much easier for an expert to explain an example and their reasoning than to specify a general rule or requirement.

The expert then goes on to repeat this for subsequent instances. Each subsequent rule must be compliant with all previous cornerstones (unless you wish to change your mind about the rule) - the first case cannot be changed to a different result by the second set of behavior for instance. The editor won’t allow a subsequent condition to contradict a prior one.

The ontology comes into play by allowing the user to generalize a reason. For instance, a person might be arrested rather than allowed in. This person might be carrying Cocaine but the expert knows that Cocaine is an illegal cargo (in the ontology) and so specifies that carrying an illegal cargo gets you arrested. Similarly they might specify that it does not matter that this particular case was a truck and that it could be any vehicle.
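A rough, purely conceptual sketch of the cornerstone idea (my illustration of “decide by example, then generalize”, not Erudine’s API): each answered case becomes a test, and every generalized rule is checked against all earlier cornerstones before it is accepted.

    cornerstones = []   # (case, expected_decision) pairs accumulated so far

    def add_rule(rule):
        """rule: function(case) -> decision or None; must not contradict prior cornerstones."""
        for case, expected in cornerstones:
            decided = rule(case)
            if decided is not None and decided != expected:
                raise ValueError(f"rule contradicts an earlier cornerstone: {case}")
        return rule

    # The expert looks at one concrete case, says what should happen, and why
    case1 = {"vehicle": "truck", "cargo": "cocaine", "has_visa": True}
    cornerstones.append((case1, "arrest"))

    ILLEGAL_CARGO = {"cocaine", "heroin"}   # the generalization comes from the ontology

    # The reason is then generalized: any illegal cargo, in any vehicle, means arrest
    arrest_rule = add_rule(
        lambda c: "arrest" if c.get("cargo") in ILLEGAL_CARGO else None)

    case2 = {"vehicle": "car", "cargo": "fruit", "has_visa": True}
    print(arrest_rule(case2))   # None - this rule does not apply; other rules would decide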

As you watch the tool work it seems pretty clear that common cases would be found quickly - the 80/20 rule would play in your favor - and that you would rapidly get all the basic conditions handled. Using Erudine to clone and replace a legacy system allows you to compare current definitions to logs or results tables. This shows historical entries with differences between Erudine and the current system, allowing the specification of clarification rules to eliminate them.

The resulting “rules” can be very complex - but specified “by example” remember, so this complexity would not necessarily be visible. Examples of rules might be:

If there is a School with a Child over the Age of 8
- and that Child has a sister in the school below the Age 7
- and the Sister shares a Class with a Boy who has a Grade A average
- and this Boy is in the same sports team as the first child
Then…

If there is a network Node under attack
- and the type of attack is a Denial of Service
- and the attack originates from outside our Secure Network
- and the Node hosts a Technical Service
- and that Technical Service supports a Business Service
- and that Business Service has an SLA level of Critical
- and we have Backup Virtualization Servers available
Then rehost the Technical Service onto a new server

The combination of the ontology and the graphical environment for specifying rules by example allows for complex objects to be manipulated using complex rules.

The tool had another nice feature allowing you to map these cornerstones or rules to a requirements document defined in the tool. As you create cases you can refine the requirements and link to them, and you can see requirements without tests and vice versa. This, combined with the other features, has resulted in some customers using Erudine purely as a behavior or requirements capture tool or to learn the behavior of an existing system with which they are less familiar than they desire.

Erudine behavioral ’services’ are stored in a Knowledge Model (KNM) file that contains all the behavior, ontology and requirements links. Access to resources is through logical links with actual links defined in a config file for deployment. Generally these resources will change through the various staging environments of a project whilst the logical connections do not. This allows the same KNM file to be staged through environments without change. Versioning is usually handled through a standard versioning repository, providing fairly coarse-grained version control (though finer-grained control is under development). Debugging is very visual - the path through the data flow model can be examined interactively for a problematic transaction. At each node the behavior rules that fire can be interrogated and even the requirements that the behavior satisfies queried.

I found the product very intriguing and I hope to work with it some more. Check it out at www.erudine.com



Posted September 15, 2008 3:21 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at White Paper available on decision management in credit.

I wrote an introduction (and a brilliant one if I do say so myself) for a white paper called “Using Automated Decisioning and Business Rules to Improve Real-time Risk Management” that was produced by the folks at Equifax. You can download it using the link to the US Banker page. Enjoy.



Posted September 15, 2008 2:47 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Nice article on decision services in SOA.

Eric Roch had a nice post today - SOA Decision Services - in which he references some of our work and our book. He ends with a great quote:

As SOA matures we are finding new ways to architect systems and receiving benefits from SOA in unexpected ways. How often have you seen improvement of operational decisions listed as a SOA benefit?



Posted September 12, 2008 8:44 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Speaking in Brussels Evening of October 10th.

Thanks to my friend Jan I will be giving a seminar at SAI - the Belgian ‘Study Center for Information Processing’ - on the evening of October 10th. You can find details here - I am speaking on ‘Decision Services’: A pattern for business rules in Service Oriented Systems and Architectures.



Posted September 12, 2008 8:09 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at What if someone with a lower pay grade were to do this?.

Patrick Joseph Gauthier wrote a great post this week called “Business Process Reengineering: The Right Skills And Roles For The Task Will Save You Money” and I loved the question he suggests (that gave me the title for this post):

“what if someone with a lower pay grade were to do this?”

He goes on to make me even happier by asserting, quite correctly IMHO:

Decisions Are Most Likely Culprit
For my money, I believe 95% of organizations include way too many “Get Supervisor Permission” and “Get Executive to Decide” steps in their processes.

Perfect. This is, indeed, one of the main drivers of decision management. Instead of having the person who is executing the process have to ask someone else for a decision, embed the decision in the process (giving the real decision-maker control over how the decision is made in the process). Instead of having hundreds of front-line staff refer decisions to many managers who follow guidelines taught to them by the one person who understands the company policy, empower the front-line staff to act by having that one person control the rules in a decision and having that decision happen automatically.

I also liked the questions he suggested and think they can be applied to decision management rather than process management with only minor changes:

Ask Yourselves…

  1. what skills are really required for the decision?
  2. what level of training and education are required?
  3. what level of authority is required to make the decision and to define how the decision is made?
  4. how much time does a customer reasonably expect this decision to take?
  5. what value (to the customer) does the decision enable?
  6. what impact does this decision have on quality where the product or service is concerned?
  7. are there any legal requirements imposed on this decision?

Now Ask…

  1. who is making this decision now?
  2. who is executing the process around this decision?
  3. are they the same person? do their qualifications, experience, and place in the organizational structure match the requirements of the task?
  4. based on the time required, how much does this decision actually cost us? (really do this math) what is the upside of a good decision and the downside of a bad one?
  5. what can be done to the decision to either push it “down” or “up” the organizational structure where it might more appropriately belong?
  6. is the cost worthwhile in my customer’s estimation? (really ask this question)

Focus on decisions in your processes - there is lots of room for improvement.



Posted September 12, 2008 1:05 AM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at SOA Symposium in Amsterdam - Free Pass.

As I said before I am speaking at the SOA Symposium in Amsterdam in October - “Decision Services: A Pattern for Smarter Service-Oriented Systems“. I think the conference will be great and I hope to meet some readers there but, if you are local, I have an even better offer. I can bring a couple of guests so go ahead and contact me if you would like a pass! james at smarternoughsystems.com



Posted September 11, 2008 6:48 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at As content on demand grows, so must decision management.

The Conference Board recently announced strong growth in online content or content-on-demand. The press release can be summarized by this comment:

Fundamentally, consumers expect content to be available when they want it, and on the screen of their choice

This, of course, creates both a challenge and an opportunity for those providing content. The challenge is that without a regular schedule they lose control of what their viewers watch - their viewers “touch that dial” whenever they want. Instead of being able to schedule content at times when the target audience is likely to be watching or to put related content together in the schedule they will have to find new ways to keep viewers engaged. The opportunity is that this new environment generates a huge amount of data about who is watching what, where and when. This data can drive analytics that give new insight about what content is likely to appeal to which consumers. Combined with adaptive control, these analytics allow content recommendations that will keep consumers engaged and even introduce them to more of the “long tail” of content that is increasingly available. The effective management of this content recommendation decision is critical and only those content providers who do this well will thrive in a content-on-demand world.

One company working on this is ThinkAnalytics. I wrote about ThinkAnalytics before and you can find more information on their content recommendation engine on their site.



Posted September 10, 2008 6:41 PM
Permalink | No Comments |

Copyright © 2008 James Taylor. Visit the original article at Predictive Marketing (Lessons from the CMO Summit #3).

Stephan Chase of Marriott generated the third set of thoughts. He is working to make Marriott more customer-centric, in particular by employing predictive modeling to determine what customers are likely to do in the future while using results in marketing to create a learning organization. This is of course the heart and soul of decision management - improving customer decisions by integrating predictions and then continuously improving. He made a number of great points on which I have comments:

  • Many organizations, he notes, start with web analytics as this is where the data are.
    But the value of adding non-web data to this, to analyze a more complete customer profile, is huge.
  • Data is a new form of creative
    Great phrase - data, or at least the information and predictions derived from it, influence your customers just like your creative does. The use of data to deliver extreme personalization is critical.
  • Discovering and applying truth about a customer to achieve mutual benefit
    If you are using all your data only to improve your profitability, not to improve the experience of your customers, they will be unwilling to share information with you. If you can make it a two-way street - mutual benefit - they are more likely to answer your questions and help you help them.
  • To make it happen, find and activate Analyst, Systematizer, Executor and Integrator
    Decision management, of course, requires the analyst to work with someone who knows how to put analytics into systems (a systematizer) as well as business owners who know how to integrate this into their business. The system itself becomes the executor.
  • They have a focus on “Customer propensity” e.g. to visit NY, to travel internationally, etc.
    Getting to the point where people say “I want propensities” before making a decision is critical.
  • Predictive marketing is the objective - I want to be able to have some idea of what customers do in the future
    Marketing to the needs a customer has not yet met is likely to be more successful than marketing to the needs they used to have but have already met.

Marriott sounds like it is on its way to decision management and I look forward to hearing more about their journey.
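To make “I want propensities” concrete: a propensity is just a predicted probability per customer that can be asked for before a decision is made. A minimal, hypothetical sketch with invented features (this is not Marriott’s model):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Features per customer: past New York stays, international trips, loyalty tier (0-2)
    X = np.array([[3, 0, 2], [0, 4, 1], [1, 1, 0], [5, 2, 2], [0, 0, 0]])
    y = np.array([1, 0, 0, 1, 0])   # booked a New York stay last year?

    model = LogisticRegression().fit(X, y)

    # "I want propensities" before making the offer decision
    new_customer = np.array([[2, 1, 1]])
    propensity_ny = model.predict_proba(new_customer)[0, 1]
    print(f"Propensity to visit NY: {propensity_ny:.2f}")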



Posted September 9, 2008 3:16 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Multi-Channel Marketing (Lessons from the CMO Summit #2).

My second set of thoughts was prompted by notes on a presentation by the CMO of Walmart.com, Cathy Halligan. She began by noting that they no longer see a digital divide - there is a big percentage overlap between their online and offline shoppers. In addition, online activities are increasingly influencing offline purchase patterns and so you must think in a multi-channel way. For instance, product reviews are the #1 request by customers and their availability on walmart.com has gotten a very positive response. Buyers are using this user-generated content both to shop online and as part of how their offline purchase decisions are made. I have blogged before about multi-channel customer experience and about how decision management complements web 2.0 and social media so this post is really a set of links, not a set of thoughts…



Posted September 9, 2008 12:16 AM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Getting to Analytic Decisions (Lessons from the CMO Summit #1).

A colleague attended the Aberdeen CMO Summit last week and took some great notes. I am going to have a couple of posts this week based on her notes. First up, some lessons from Paul DePodesta (of the Padres). Paul focused on some of the challenges of moving from judgmental to more analytic decision making. While his focus is on manual decision making and how analytics can improve it, there are direct parallels when thinking about how to make your marketing systems more analytic. For instance, Paul identified some psychological barriers that come up when trying to change the way people make decisions. These are also challenges when trying to replace a manual decision with a systematic one, whether analytic or not. Here they are with some comments:

  1. Desire for acceptance
  2. Emotional attachment
    People become very emotionally attached to having people make decisions so this is particularly problematic when trying to move to systematic decision management.
  3. Tradition
    Analytic decisions are often counter-intuitive and overturn traditional views of what works and what does not.
  4. A focus on recent outcomes versus long term trends
    This partly explains why analytic decisions are counter-intuitive as the use of analytics tends to focus on trends not recent outcomes.
  5. Affirmation bias - reinforcing one’s own beliefs
  6. A “too hard” pile
    His focus was that some discussions that need to happen don’t (because they are too hard) but it seemed to me that there is also an opportunity here. Perhaps some things that were thought to be “too hard” can now be figured out with analytics, like considering retention risk or long term profitability in the loyalty program.
  7. Physical appearance (3.9% of the population is over 6′2″, yet 30% of F500 CEOs are)
    One of the attractions of analytics is that they don’t care what you look like. This is one of the reasons they work better than judgmental decisions - they are not carried away by visual impressions unlike (say) a store clerk.

Paul recommends, as I would, asking “if we were starting again, how would we do this?” as this can often unlock some of the built-in biases. He also emphasized the power of asking naive questions, and of continuing to ask them, for similar reasons. He suggested humility in the face of uncertainty and urged his audience to recognize the uncertainty. I would go further and suggest that uncertainty creates opportunity if you can use analytics to turn uncertainty into probability.

Paul talked about analyzing the “most valuable situations” and asking “what put us in those most valuable situations”. This is a great way to turn existing manual decision making into ideas for decision management. Take a call with a very successful cross-sell, say, and see what put you in the position to make that cross-sell. Can you replicate it systematically to maximize your chances of a similar success in other calls? He also suggested that you keep a decision diary so you can judge decisions in the time and circumstances not in 20/20 hindsight. This, of course, is fundamental when using analytic models as it is vital to consider the information available at the time a decision was made, not the information available at the time the results of the decision become apparent!

Lastly he emphasized a focus on having a winning process - as he said, “Be the house” so that you win on average, even though you don’t win every time. This is exactly the attitude you need to adopt for decision management. Good decision management will make you the house - your decisions will, on average, be winners. More from the CMO summit tomorrow.



Posted September 9, 2008 12:00 AM
Permalink | No Comments |

Copyright © 2008 James Taylor. Visit the original article at Decisions matter to Complex Event Processing.

An old colleague asked me to explain a little about the difference between Complex Event Processing or CEP and decision management. In particular he referenced a recent series of articles by James Kobielus, the last of which (titled Really Happy in Real Time) discussed how “Complex event processing empowers the contact center to manage the customer relationship”.

Interestingly enough this whole topic - of rules, decision management, analytics and CEP - has been getting attention in the blogosphere recently. Check out these posts by me on my ebizQ blog as well as related ones elsewhere:

I am not going to repeat the whole discussion here but suffice it to say that Complex Event Processing involves Complex Event Detection/Correlation, Decision Management and Process Execution. Thus a CEP product may well have many, if not all, of the capabilities you need for decision management. Similarly, the difference between a decision management platform and a CEP one is simply the degree to which the underlying platform “understands” events and has capabilities to make it easy to detect and correlate them.

Me, I think that few companies will find that ALL decisions are part and parcel of CEP applications (any more than they will find that all decisions are tied to business processes) and will find it useful therefore to consider a CEP environment that handles decisions well, a Business Process Management environment that does likewise and a core decision management environment. These might all use the same rules engine or analytic execution engine but from an enterprise architecture point of view it is important to think of them separately.
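A toy sketch of that separation, with invented event types and thresholds: event detection/correlation notices a pattern in the stream, then hands the correlated facts to a decision function that could just as easily be called from a business process.

    from collections import deque
    from typing import Optional

    recent_failures = deque(maxlen=50)   # sliding window of recent login-failure events

    def decide_account_action(account_id, failure_count):
        """Decision logic: the same function could be invoked from CEP or from a process."""
        return "lock_account" if failure_count >= 5 else "allow"

    def on_event(event) -> Optional[str]:
        # Event correlation: count recent failures for the same account
        if event["type"] != "login_failure":
            return None
        recent_failures.append(event)
        count = sum(1 for e in recent_failures if e["account"] == event["account"])
        return decide_account_action(event["account"], count)

    print(on_event({"type": "login_failure", "account": "42"}))   # allow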



Posted September 5, 2008 11:36 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Great event, nice little discount.

There is a great conference coming up October 26-30 - not only are Neil and I Co-Chairs but readers of the blog can get a discount. We are presenting twice - a Pre-Conference Tutorial, Succeeding as a Decision-Centric Organization, and a Keynote, Competing on Decisions. Because of this you can get a special $100 Conference Discount courtesy of Smart (enough) Systems LLC if you use code “GVDJT” when you register.
See details below


“I have concluded that decision making and the techniques and technologies to support and automate it will be the next competitive battleground for organizations. Those who are using business rules, data mining, analytics and optimization today are the shock troops of this next wave of business innovation”
Tom Davenport,
Author of Competing on Analytics: The New Science of Winning
Hear a special video keynote and roundtable discussion with Tom Davenport exclusively at this year’s Forum.

About the Business Rules Forum & 1st EDM Summit
Have a look at this year’s program. Find out what the excitement is all about!

Conference Schedule

Get the whole scoop on this year’s Conferences … Download a copy of our new 20-page Conference Brochure featuring complete details of this year’s unparalleled event.
Or check out the information online

Here are the details of our Pre-Conference Tutorial …
Succeeding as a Decision-Centric Organization
When: Sunday, October 26, 2008
Topic:
The principles of Enterprise Decision Management, its application to critical business processes and decisions and the appropriate use of available technology.
Highlights:

  • How to identify and prioritize the operational decisions that drive your organization’s success
  • How to use business rules as a foundation to automate these decisions for maximum agility
  • How to improve these decisions using data mining and predictive analytics
  • How to ensure continuous improvement and competitive advantage using adaptive control

And our Keynote …
Competing on Decisions
When: Wednesday, October 29, 2008
Topic: Neil Raden and James Taylor introduce a new competitive concept – Competing on Decisions. Thanks are due to Tom Davenport for raising awareness of the need for analytics in his book, “Competing on Analytics.” Seeking out the increasingly small margins required by competitive business pressures has brought analytics into vogue, typically requiring vast amounts of digestible data. Getting their arms around this deluge of new and existing information, organizations realize that they can gain analytic insights into customers, products, channels, partners and much more. But some companies are already finding that analytics is only a part of the process – the intelligent application of these new insights can only pay off if the decisions that are made are correct. By becoming decision-centric, by using business rules to control those decisions and by leveraging their data to make the best decisions, companies are increasingly competing on decisions.
Highlights:

  • What is a decision in a business context?
  • Why decisions matter
  • Why decisions are different
  • How business rules control decisions

Other keynotes at this year’s events include Tom Davenport, author of the bestselling book Competing on Analytics and Professor at Babson College, who will deliver an exclusive video keynote on “Decision Making, Decision Management and Technology,” and Ron Ross, author of numerous definitive books on business rules, who will deliver the opening keynote “From Here to Agility”.

Besides these great keynotes, this year’s program is truly outstanding with 60+ rich, in-depth sessions to help you bring real agility to your organization.

The 2008 Program Schedule offers world-class sessions on business rules, enterprise decisioning and related technologies. Check out the schedule, speakers and abstracts!

To Register and Receive Your Discount

When you register, be sure to include the promotional code - GVDJT - which gets you $100 off registration.


See you there…



Posted September 4, 2008 1:44 AM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at The role of decision management in creating (and maintaining) a common vision.

An interesting article on the role of the business analyst in creating a common vision caught my eye this morning. The article focused on creating a common vision but it made me think about maintaining and developing that common vision over time, particularly of the complex logic in a system. Procedural code does not lend itself to business user understanding and I am not convinced there is that much a business analyst can do to help. If, however, the complex logic is externalized as a decision and that decision is managed declaratively (using business rules, say) then the business analyst (and the business user) have a viable point of communication with the programmers. Whether the non-technical users maintain the rules directly or collaborate with programmers to make the changes they need, the separation of business logic from “plumbing” code and the use of a declarative, higher-level syntax mean they will be much more likely to maintain a common vision of the system’s behavior. As I have said before, we could call this application development 2.0.
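A tiny, invented example of the difference: the same discount logic buried in procedural code versus externalized as a declarative rule table that a business analyst (or business user) can read, discuss and maintain.

    # Procedural: the logic is interleaved with code, hard for a business user to follow
    def discount_procedural(order):
        if order["customer_type"] == "gold":
            if order["total"] >= 1000:
                return 0.15
            return 0.10
        if order["customer_type"] == "silver" and order["total"] >= 1000:
            return 0.05
        return 0.0

    # Declarative: the rules are data, separate from the plumbing that applies them
    DISCOUNT_RULES = [
        # (customer_type, minimum_total, discount) - first matching rule wins
        ("gold",   1000, 0.15),
        ("gold",      0, 0.10),
        ("silver", 1000, 0.05),
    ]

    def discount_declarative(order):
        for customer_type, minimum_total, discount in DISCOUNT_RULES:
            if order["customer_type"] == customer_type and order["total"] >= minimum_total:
                return discount
        return 0.0

    order = {"customer_type": "gold", "total": 1200}
    assert discount_procedural(order) == discount_declarative(order) == 0.15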



Posted September 3, 2008 4:38 PM
Permalink | No Comments |
Copyright © 2008 James Taylor. Visit the original article at Analytics simplify data to amplify its value.

Analytics simplify data to amplify its value

This was a phrase I remember from my friends in the Fair Isaac R&D team. I have no idea if it is original or a well-known analytic quote but I like it. Think about it: most business users would say they want usable, actionable information, not just data, so analytics “amplify value” by replacing large amounts of data with a statement encapsulating what that data implies. After all, better data adds no value to a company on its own; only better decisions made thanks to better data do.



Posted September 3, 2008 12:57 AM
Permalink | No Comments |