

Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is a leading expert in decision management and a passionate advocate of decisioning technologies: business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer, and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

Mike Gualtieri published a nice piece on business rules engine algorithms last July that I wanted to point out to my readers. Mike sorts the mainstream rules engine algorithms into three groups: those that deliver inferencing at run time, those that execute rules sequentially, and those that execute sequentially but use compile-time algorithms to sequence the rules correctly. While I have a few comments on Mike's report, I was struck both by its measured tone and by a great piece of advice:
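To make the first two categories concrete, here is a toy sketch (hypothetical rule and fact names, not any vendor's API): a sequential engine walks the rule list once in authoring order, while an inferencing engine keeps re-matching until no more rules fire, so one rule's action can enable another rule regardless of where it sits in the list.

```python
# Toy sketch of two rule-execution styles (illustrative only, not a
# real rules engine). A rule is a (condition, action) pair over a
# dict of facts.

def sequential(rules, facts):
    """Sequential: evaluate each rule exactly once, in authoring order."""
    for condition, action in rules:
        if condition(facts):
            action(facts)
    return facts

def inferencing(rules, facts):
    """Inferencing: re-match until quiescence; each rule fires at most
    once (refraction), but firing order is driven by the data."""
    fired = set()
    changed = True
    while changed:
        changed = False
        for i, (condition, action) in enumerate(rules):
            if i not in fired and condition(facts):
                fired.add(i)
                action(facts)
                changed = True
    return facts

# Hypothetical rules: big spenders become VIPs, VIPs get a discount.
rules = [
    (lambda f: f["spend"] > 100, lambda f: f.update(vip=True)),
    (lambda f: f.get("vip"),     lambda f: f.update(discount=0.1)),
]

# Listed "backwards", only the inferencing engine still derives the
# discount, because it re-matches after the VIP fact is asserted.
print(inferencing(list(reversed(rules)), {"spend": 150}))
print(sequential(list(reversed(rules)), {"spend": 150}))
```

This is exactly why authoring flexibility matters: with inferencing, rule authors need not worry about ordering; with a sequential engine, either the author or a compile-time sequencing step has to get the order right.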
Let Authoring Flexibility Drive Your Algorithm Decision
This is key. The extent to which the tool allows you to write rules the way you need to write them, the way your business users need to write them, is what matters. It is the flexibility and agility that business rules give you that is the primary value driver. Pick your vendor based on how the rule editing and management environment will work for you. The capabilities of the vendor's algorithm(s) will impact this, but they are just part of the puzzle - the kind of editing and management environment will matter more. Most of the major rule vendors will do a good job on performance if you use the tools the way they are intended and don't try to force-fit your previous programming experience too much. If you are interested in this topic, buy the report (it's a good one). I would just add a couple of things:
  • I think he under-calls the potential for inferencing engines to run faster than sequential ones when a very large number of rules exists but each transaction fires only a tiny percentage of them (common in regulatory compliance, for instance)
  • Some vendors allow different algorithms to be used in different steps in a decision, a useful feature
  • I have never found a Rete user who had trouble recreating a bug. The data in a transaction determines the sequence of execution of rules, and the same data/transaction will reliably drive the same sequence of execution. Sure, different data results in a different order of execution, but that has no impact on recreating a bug
  • I think the ability to integrate predictive analytics with business rules is already bringing new algorithms to bear. A decision tree built using a genetic algorithm might execute the same way any other decision tree does but it shows the results of the new algorithm just the same.
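On the reproducibility point above, a quick sketch (again with hypothetical rule names, not a real Rete implementation): in a match-driven engine the firing order is a pure function of the input facts, so replaying the same transaction replays the identical firing trace - which is what makes bugs reproducible.

```python
# Sketch: a match-driven run returns its firing trace. Same facts in,
# same trace out, every time (illustrative only).

def run(rules, facts):
    """Fire matching rules to quiescence and record the firing order."""
    trace, fired = [], set()
    changed = True
    while changed:
        changed = False
        for name, condition, action in rules:
            if name not in fired and condition(facts):
                fired.add(name)
                action(facts)
                trace.append(name)
                changed = True
    return trace

# Hypothetical rules: spend drives tier, tier drives discount.
rules = [
    ("gold",     lambda f: f["spend"] > 500,        lambda f: f.update(tier="gold")),
    ("discount", lambda f: f.get("tier") == "gold", lambda f: f.update(discount=0.2)),
]

# Replaying the same transaction yields the identical firing trace.
assert run(rules, {"spend": 600}) == run(rules, {"spend": 600}) == ["gold", "discount"]
# Different data gives a different (but still deterministic) trace.
assert run(rules, {"spend": 100}) == []
```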

Posted January 9, 2009 11:20 PM