

Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is the leading expert in decision management and a passionate advocate of decisioning technologies – business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

January 2009 Archives

With the business world in a state of flux and everyone worried about what might happen next, and how they might respond to it, scenario testing (and its compatriot, stress testing) should be top of mind for executives. They should be thinking about different scenarios, testing out how those scenarios would affect their business and trying out various alternatives. On the risk side they should be using this kind of scenario planning to stress their assumptions - stress testing - to see how their financial reserves would cope with the various alternatives.

For too many executives, however, this kind of testing is done only at the aggregate level and largely (if not completely) in Excel. I have nothing against Excel, but for this purpose it is not really acceptable. Good scenario or stress testing should consider how customers, products, suppliers and locations will be impacted by the scenario at a granular level and then present rolled-up results, not simply attempt to model some averages or totals. Similarly, if executives want to develop alternative scenarios that would be effective in certain possible futures then they need to test those scenarios against actual transactions and actual customers to see if they work.

Companies that have adopted decision management have the infrastructure to manage this. Decision management brings the crucial decisions - choices of actions - into the open and makes them explicit. Scenarios can be developed for these decisions and tested against real data. The results can be compared against what actually happened, or against alternative scenarios, to see what would work best. Different assumptions can easily be fed into the decisions to see what impact they have, and stress testing or scenario development conducted based on the results. Decision management makes all this possible. It's still work, but it is much less work, and the results can be far more precise because they are grounded in real decisions.

A growth in scenario management was one of my predictions for 2009 and Jim Sinur wrote a nice piece on this too - Scenario Planning is No Longer Optional.
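To give a flavor of the granular approach, here is a minimal C# sketch (the Transaction and Scenario types and the loss formula are my own toy inventions, not any product's API): the same per-transaction decision is replayed under each scenario's assumptions and only then rolled up.

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical types, invented for illustration only.
public record Transaction(string CustomerId, decimal Amount);
public record Scenario(string Name, decimal DefaultRateAssumption);

public static class ScenarioTester
{
	// Replay one decision over every real transaction under a scenario's
	// assumptions, then roll the granular results up into a total.
	public static decimal ExpectedLoss(
		IEnumerable<Transaction> book,
		Scenario scenario,
		Func<Transaction, Scenario, decimal> lossDecision) =>
		book.Sum(t => lossDecision(t, scenario));

	public static void Main()
	{
		var book = new List<Transaction> { new("C1", 10_000m), new("C2", 25_000m) };

		// Toy per-transaction decision: expected loss under the scenario.
		Func<Transaction, Scenario, decimal> loss =
			(t, s) => t.Amount * s.DefaultRateAssumption;

		foreach (var s in new[] { new Scenario("Base case", 0.02m),
		                          new Scenario("Severe downturn", 0.08m) })
			Console.WriteLine($"{s.Name}: expected loss {ExpectedLoss(book, s, loss):N0}");
	}
}

The point is the shape - decisions evaluated at transaction level, aggregation afterwards - not the toy math.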

Posted January 15, 2009 3:25 PM
First Look - Mobile Agent Technologies

Mobile Agent Technologies (www.agentos.net) is an early stage start-up offering an integrated platform for decision automation - Einstein Enterprise. This integrates various technologies typically sold separately, like business rules and analytics, and is intended as a horizontal product for the automation and management of decisions.

Einstein Enterprise is Java-based and combines open source and proprietary components; a couple of the key pieces are either patented (the cloud computing piece) or held as trade secrets (such as Common Sense Reasoning™). The product supports:

  • Cloud computing
    It runs on any of the machines a company has (an internal cloud, not the Amazon/Google cloud)
  • Business rules
  • Data mining and predictive analytics
  • Complex Event Processing
  • Content Management System
  • Relational Database

The product is new and the company has a small number of Fortune 500 customers evaluating and developing with it. Uses range across the decision management market from marketing and customer acquisition, such as a product recommendation engine, to underwriting, medical diagnosis and treatment, and fleet logistics. The 1.0 product is not yet aimed at pure business users (in terms of rules or data mining) but it is already beyond the geeks-only level, and the team is working on the kind of business-user-friendly interfaces needed.

I have not had a chance to look at it in detail but it sounds like a really interesting pure-play decision management platform and I look forward to seeing more of it in the future.


Posted January 14, 2009 8:36 PM
Hardcoding + procedural code = bad news

In a blog post, Hardcoding Considered Harmful - or is it?, Jeff Palermo said:

Oren Eini boldly makes the assertion that a system is simpler to maintain when configuration is hard-coded in one place within the system. Coupled with an automated testing and deployment process, changing configuration can be just as simple and predictable as possible. Oren asserts that hard-coding as much as possible enhances maintainability. He then defends his position with an example in a subsequent post.

In the example post Oren has this code snippet:

public class DiscountPreferredMembers : OnOrderSubmittal
{
	// When an order is submitted, preferred members get a 5% discount.
	public override void Execute()
	{
		if ( User.IsPreferred )
			Order.AddDiscountPrecentage(5);
	}
}

He goes on to say:

We hard code the rules, which is the easiest approach to getting things done. Because we have a structured way of doing that, we can now apply the ways in which we use it. For instance, if we wanted to support runtime replacements of rules, that is easy, all we need to do is to provide a FileSystemWatcher and reload them. If we want to add a rule, we just create a new class for that. If we want to modify a rule, we go and change the hard coded logic.
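For concreteness, here is my own minimal sketch of the mechanism Oren alludes to - a System.IO.FileSystemWatcher that triggers a reload when a rule file changes. The reload callback itself (recompiling and swapping in the new rule classes) is left as a placeholder, because that is exactly the plumbing you would have to write and maintain:

using System;
using System.IO;

public sealed class RuleHotReloader : IDisposable
{
	private readonly FileSystemWatcher _watcher;

	// reloadRules stands in for the recompile-and-swap logic you would
	// have to build, test and keep working yourself.
	public RuleHotReloader(string rulesDirectory, Action reloadRules)
	{
		_watcher = new FileSystemWatcher(rulesDirectory, "*.cs")
		{
			NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName
		};
		// Any change to a rule source file triggers a reload.
		_watcher.Changed += (_, e) => reloadRules();
		_watcher.Created += (_, e) => reloadRules();
		_watcher.EnableRaisingEvents = true;
	}

	public void Dispose() => _watcher.Dispose();
}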

Wow - the things a programmer will describe as simple! What if this had been hard-coded in a Business Rules Management System or BRMS?

  • It would still be hard-coded and easy to write.
  • It would be deployed as a simple to use, structured component just as Oren describes.
  • If we wanted to support runtime replacement of the rule we would have to do nothing (the BRMS already handles that).
  • If we wanted to add a rule we would just add a rule - no need to create another class.
  • Modification of the rule would be about the same.

The differences become more extreme as the complexity of the decision in question increases. If there were 5 or 10 or 100 rules that had to be applied, as there often are in discount scenarios, we would end up with lots of classes and/or methods and nested if..thens. Not with business rules - just a simple, flat list of rules that are easy to read, easy to manage and easy to change. The problem with this example is not so much that it has been hard-coded as that it has been hard-coded in a language that is, to quote Ira Fuchs,

…syntactic, abbreviated, and procedural, as opposed to semantic, verbose, and declarative
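To see the difference in shape, here is a rough C# approximation of the flat-rule-list idea (my own sketch - a real BRMS would use a more semantic, business-readable rule syntax, not C# lambdas):

using System;
using System.Collections.Generic;

// Hypothetical Order/User types for illustration.
public class Order { public decimal DiscountPercentage; }
public class User { public bool IsPreferred; public int YearsAsCustomer; }

// One rule = one independent condition/action pair; no nesting, no
// ordering dependencies buried in procedural code.
public record DiscountRule(
	string Name,
	Func<User, Order, bool> When,
	Action<User, Order> Then);

public static class DiscountDecision
{
	public static readonly List<DiscountRule> Rules = new()
	{
		new("Preferred member discount",
			(u, o) => u.IsPreferred,
			(u, o) => o.DiscountPercentage += 5),
		new("Loyalty discount",
			(u, o) => u.YearsAsCustomer >= 10,
			(u, o) => o.DiscountPercentage += 2),
		// Adding rule 3..100 means adding an entry here, not another class.
	};

	public static void Execute(User user, Order order)
	{
		foreach (var rule in Rules)
			if (rule.When(user, order))
				rule.Then(user, order);
	}
}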

Hard-coding in business rules would dramatically improve the approach Oren proposes. Indeed many BRMS customers do exactly this - they use the BRMS and the business rules syntax but they continue to hard code the rules behind their decisions - developers just write rules, not code pretending to be rules. Now Jeff goes on to say:

I agree with Oren. The first draft of some functionality should hard-code everything. Then subsequent revisions will cause some information to be factored out into mediums that can be maintained while supporting all the requirements in scope. The requirements cause us to make decisions about what information to hard-code and which information to soft-code.

I understand the point Jeff and Oren are making here - soft-coding things can make them more complex than necessary and can, especially if done thoughtlessly, create maintenance problems. However, hard-coding in traditional procedural languages like Java or C# means creating a procedural, abbreviated piece of code that will become increasingly hard to maintain as its complexity increases and that business users will not find approachable or easy to understand. Hard-coding in business rules avoids this problem and, when something does need to be “soft-coded”, a BRMS allows an easy transition to business user rules management using template-driven or other approaches layered on to the same underlying rules execution and management environment.
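As a sketch of what template-driven soft-coding looks like (hypothetical names throughout; in a real BRMS the parameter values would live in a managed repository that business users edit through forms, not in source):

using System;
using System.Collections.Generic;

// The rule's structure stays fixed and developer-owned; business users
// edit only the parameters. This mimics the template idea, not any
// particular BRMS product's API.
public record DiscountTemplate(string Segment, decimal Percentage);

public static class TemplatedDiscounts
{
	// Stand-in for an externally managed parameter set.
	public static readonly List<DiscountTemplate> Parameters = new()
	{
		new("Preferred", 5m),
		new("Employee", 10m),
	};

	public static decimal DiscountFor(string segment)
	{
		foreach (var t in Parameters)
			if (t.Segment == segment)
				return t.Percentage;
		return 0m;
	}
}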

Hard-coding may not be harmful absolutely all the time but coding business rules in code probably is.


Posted January 14, 2009 12:50 AM
Decision Sciences in Healthcare - Academic Research Request

A client of mine is looking to align themselves with a university and get some published research on using decision science in healthcare communications. Anyone know people or universities that might be interested? Let me know - james at jtonedm.com


Posted January 13, 2009 5:34 PM
Transforming retail with analytics and decision management

Tom Davenport has done some research into analytics and retail (reported here: Retailers recognise analytics as key to business transformation). Here’s a quote from the news item:

Retailers today are searching for ways to derive more customer intelligence, marketing savvy and operational insight from their overflowing databases. In addition to acknowledging that the use of analytics is the key to future success in this data-intensive industry, retail executives also revealed to Davenport that:

  • Analytics improve retailers’ bottom lines most quickly when applied to pricing and merchandising.
  • Analytics drive market differentiation and customer-centric marketing.
  • Analytics help retailers achieve demand-driven supply chain optimisation (sic).

What I find interesting about this is that all these examples of the power of analytics involve things that are becoming more dynamic. So:

  • Price optimization means dynamically setting the most effective price (balancing profit, acceptance rates, supply, etc.) one customer at a time
  • Analytic merchandising increasingly means making merchandising decisions store by store, month by month or even day by day
  • With most retailers now being multi-channel, marketing and customer-centricity must be extended to the web and other channels
  • Supply chain optimization is becoming more dynamic, combining optimization and business rules, rather than the subject of occasional planning exercises

So the value of analytics is to be found when analytics are injected into operational processes - the essence of decision management. Increasingly retailers, like other users of analytics, will find themselves thinking about decision management based on analytics, not just about analytics.
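A minimal sketch of what that injection looks like for the pricing example (all names hypothetical; the acceptance model is stubbed with a toy formula where a real one would come from data mining):

using System;

// Hypothetical customer with a toy price-sensitivity attribute.
public record Customer(string Id, double PriceSensitivity);

public static class PricingDecision
{
	// Stand-in for a real predictive model's output.
	static double AcceptanceProbability(Customer c, decimal price) =>
		Math.Max(0, 1.0 - c.PriceSensitivity * (double)price / 100.0);

	// Pick the price for THIS customer that maximizes expected margin,
	// subject to a business rule (a floor price).
	public static decimal ChoosePrice(Customer c, decimal cost, decimal floor)
	{
		decimal best = floor;
		double bestValue = double.MinValue;
		// Enumerate candidate price points; score each by expected margin.
		for (var price = floor; price <= cost * 2; price += 0.50m)
		{
			var value = AcceptanceProbability(c, price) * (double)(price - cost);
			if (value > bestValue) { bestValue = value; best = price; }
		}
		return best;
	}
}

The toy formula is beside the point; what matters is that the analytic score is consumed inside an operational, per-customer decision rather than in an offline planning exercise.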

I have a few posts on Decision Management in Retail here and I expect I will have more over the coming months.


Posted January 13, 2009 5:01 AM