Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is the leading expert in decision management and a passionate advocate of decisioning technologies – business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

June 2008 Archives

Copyright © 2008 James Taylor. Visit the original article at Book Review by Ade McCormack.

Ade McCormack, author of The IT Value Stack (reviewed previously) and columnist at the FT in England, just reviewed the book for this blog. You can read his review of Smart (Enough) Systems over on his blog.



Posted June 30, 2008 8:59 PM
Copyright © 2008 James Taylor. Visit the original article at First Look - Corticon.

I got a walkthrough of Corticon’s Business Rules product last week - the first time I have discussed it in a while. Version 5 has some interesting features. The Business Rules Foundation is a set of headless services, designed to support a variety of development tools, UI frameworks and metaphors along with a variety of persistence mechanisms. On top of this Corticon ships its own Rule Studio, built in Eclipse, and IDS Scheer ships its ARIS Rule Designer (allowing rules to be specified in parallel with process models and stored together in a single repository). The core engine uses code generation, currently to Java or .Net. Finally, there is a data integration layer to integrate external data so that it can be accessed as needed by rules as they execute rather than having to be marshaled for a call to a decision service.

The tool is aimed at less technical users and this is something Corticon clearly works on. I have my doubts about Eclipse as a business-friendly tool but Corticon does offer a fat-client version that is more streamlined. Users define a vocabulary (either from scratch or by importing XMI, UML, XSD or Java interfaces), specify constraints on attributes, map the vocabulary to inbound data or to external data sources, and define rules against it. Rules are defined in rule sheets (similar to what most vendors call rule sets) and these are linked together into ruleflows - what I might call a decision flow. The rule sheets are designed to make it easy to define the values different attributes must have for a rule to be true (using a table metaphor) and can then be linked to rule statements (something like reason codes) with a rich set of metadata related to outcomes. Corticon’s approach is sometimes called a Decision Table but their approach is different from most others’ - less compact but able to define more complex rule logic in the table. Rule conditions trigger actions, such as setting attributes to specific values. The ruleflow allows sequencing of these rule sheets as well as some iteration and branching - the iteration seems mostly needed to compensate for the lack of a Rete engine to automatically re-fire rules but could also be used to iterate more generally.
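
To make the rule sheet idea concrete, here is a minimal hand-rolled Java sketch - not Corticon's representation or its generated code, and all the attribute names are invented - of what one column of a decision-table style rule does: test conditions over vocabulary attributes, set an attribute as the action, and attach a rule statement.

```java
// Hand-rolled illustration of a decision-table style rule sheet.
// The attributes (age, priorClaims, riskTier) are hypothetical.
public class EligibilityRuleSheet {

    public static class Applicant {
        int age;
        int priorClaims;
        String riskTier;      // set by the rules (the "action")
        String ruleStatement; // reason-code style message

        Applicant(int age, int priorClaims) {
            this.age = age;
            this.priorClaims = priorClaims;
        }
    }

    // Each branch plays the role of one "column" of the table:
    // a set of conditions plus the action and rule statement it triggers.
    public static void evaluate(Applicant a) {
        if (a.age >= 25 && a.priorClaims == 0) {
            a.riskTier = "LOW";
            a.ruleStatement = "Mature applicant with clean history";
        } else if (a.age >= 25 && a.priorClaims > 0) {
            a.riskTier = "MEDIUM";
            a.ruleStatement = "Mature applicant with prior claims";
        } else {
            a.riskTier = "HIGH";
            a.ruleStatement = "Young applicant";
        }
    }

    public static void main(String[] args) {
        Applicant a = new Applicant(30, 1);
        evaluate(a);
        System.out.println(a.riskTier + ": " + a.ruleStatement);
    }
}
```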

Validating that rules are complete and consistent has always been something Corticon has handled well. While a number of the vendors have closed the gap recently, the table metaphor Corticon uses works in their favor here and this remains a strong feature. They have added some nice testing facilities, allowing input, output and expected data to be compared. This kind of support for regression testing has become widespread in the rules industry in recent years and it is nice to see it here also.
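
And as a rough illustration of that style of regression testing - a generic JUnit 5 sketch written for this post, not Corticon's test facility, reusing the hypothetical EligibilityRuleSheet from the sketch above - input data is paired with expected outcomes and compared with what the rules actually produce:

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Regression test sketch: known inputs, expected outputs, actual outputs compared.
public class EligibilityRuleSheetTest {

    @Test
    public void matureCleanApplicantIsLowRisk() {
        EligibilityRuleSheet.Applicant a = new EligibilityRuleSheet.Applicant(30, 0);
        EligibilityRuleSheet.evaluate(a);
        assertEquals("LOW", a.riskTier);
    }

    @Test
    public void youngApplicantIsHighRisk() {
        EligibilityRuleSheet.Applicant a = new EligibilityRuleSheet.Applicant(19, 0);
        EligibilityRuleSheet.evaluate(a);
        assertEquals("HIGH", a.riskTier);
    }
}
```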

Corticon prides itself on being “model driven” and on not needing much programming support. One area where this shows up is the data connectors, which do not require any hand-written SQL; instead the SQL is generated from the model (vocabulary) and the database schema. Obviously this is easier for non-technical users, but it does mean that specific stored procedures or hand-tuned SQL cannot be used to access a database, and this could be a little limiting.

Corticon has also started expanding into application frameworks with one aimed at automated customer acquisition (primarily in Insurance).



Posted June 30, 2008 7:05 PM
Copyright © 2008 James Taylor. Visit the original article at First Look - Visual Numerics.

Visual Numerics is a 100 person, privately held company that’s been around for a while - nearly 40 years - and yet is largely under the radar thanks to the size of other “analytics” companies. As the business world moves from BI to analytics it is sometimes finding that BI tools are really focused on reporting and OLAP. VNI offers embeddable libraries for advanced math and visualization. Its focus on embedding, rather than on being an application, is not unique among analytics companies but it is not common either.

VNI offers a wide range of algorithms and one of its key value propositions is supporting multiple platforms and languages. Their routines work everywhere, and work the same everywhere, which can be a big deal for customers. In particular, they offer common routines for customers seeking both Java and .Net/C# options. They help companies get a faster time to market for analytics, offer platform consistency, deliver efficient algorithms and can offer execution at the source, e.g. in-database.
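
To give a feel for what an embeddable analytic routine looks like - this is plain Java written for illustration, not VNI's IMSL API - here is a self-contained ordinary least squares fit of the kind such a library would supply identically across platforms:

```java
// Plain-Java ordinary least squares: the kind of routine an embeddable
// analytics library supplies so that every platform gets identical results.
public class SimpleRegression {

    /** Returns {intercept, slope} for y = a + b*x fitted by least squares. */
    public static double[] fit(double[] x, double[] y) {
        if (x.length != y.length || x.length < 2) {
            throw new IllegalArgumentException("need matching arrays of length >= 2");
        }
        double meanX = 0, meanY = 0;
        for (int i = 0; i < x.length; i++) { meanX += x[i]; meanY += y[i]; }
        meanX /= x.length;
        meanY /= y.length;

        double sxy = 0, sxx = 0;
        for (int i = 0; i < x.length; i++) {
            sxy += (x[i] - meanX) * (y[i] - meanY);
            sxx += (x[i] - meanX) * (x[i] - meanX);
        }
        double slope = sxy / sxx;
        double intercept = meanY - slope * meanX;
        return new double[] { intercept, slope };
    }

    public static void main(String[] args) {
        double[] x = { 1, 2, 3, 4, 5 };
        double[] y = { 2.1, 4.0, 6.2, 7.9, 10.1 };
        double[] ab = fit(x, y);
        System.out.printf("y = %.2f + %.2f x%n", ab[0], ab[1]);
    }
}
```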

VNI has historically been focused on science and research but is now increasingly business focused, with both direct customers such as Humana and OEMs such as SAP and Teradata. Historically it did a lot of work in flight testing, space, pollution, oil and gas, weather forecasting and the like. Now they do more “normal” predictive analytics and optimization. Today they say they often find customers who have used “first generation” capabilities and now need to move to newer and more powerful (or more specific) techniques, e.g. in forecasting. VNI’s willingness to do consulting to build custom algorithms, visualization applications and other software is part of their pitch. Example applications include product supply optimization, portfolio analysis, visualization, forecasting, network traffic and other similar high performance analytic scenarios.

Their OEM business is clearly their current focus, and their embeddability is going to be a big advantage there. For instance:

  • Teradata is embedding the C-based analytics in the database engine to offer in-database analytics as user defined functions (see below)
  • NextSigma is using their .NET algorithms for Monte Carlo simulations in support of Six Sigma (a generic sketch of that kind of simulation follows this list)
  • Acision, a company providing mobile messaging infrastructure, is developing a new application based on their algorithms to deliver marketing analytics, derived from text message traffic, to their telco customers
  • SAP Netweaver is using their code to improve text search (initially).
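
The NextSigma item is a good reminder of how small and self-contained these embedded analytic pieces can be. Here is a generic Monte Carlo sketch (written for this post, not NextSigma's or VNI's code; all the figures are invented) that estimates how often a process falls outside its specification limits:

```java
import java.util.Random;

// Generic Monte Carlo sketch: estimate how often a part falls outside its
// tolerance limits, given an assumed normal distribution for the dimension.
public class ToleranceSimulation {

    public static void main(String[] args) {
        double mean = 10.0;               // nominal dimension (assumed)
        double stdDev = 0.05;             // process variation (assumed)
        double lower = 9.9, upper = 10.1; // specification limits (assumed)
        int trials = 1_000_000;

        Random rng = new Random(42);
        int defects = 0;
        for (int i = 0; i < trials; i++) {
            double sample = mean + stdDev * rng.nextGaussian();
            if (sample < lower || sample > upper) {
                defects++;
            }
        }
        double defectRate = (double) defects / trials;
        System.out.printf("Estimated defect rate: %.4f%%%n", defectRate * 100);
    }
}
```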

They offer two main products - the IMSL numerical libraries and PV WAVE visual data analysis. Their Java and .Net products are adding 2D/3D charting linked to the algorithms, and overall data mining, predictive analytics, optimization and simulation are the focus areas going forward.

VNI clearly competes at some level with the likes of SAS and SPSS but VNI aims to offer algorithms that are more embeddable and more cross-platform without the overhead of a complete application. They also compete with some of the easier to embed analytic application products like KXEN or Matlab but seem to offer a broader array of algorithms as well as a smaller footprint.

Teradata, a Visual Numerics customer, recently announced the availability of Warehouse Miner 5.2. This offers data mining functions embedded in the Teradata database. A major component of Warehouse Miner 5.2 is the Teradata Statistical Library - a collection of functions based on Visual Numerics’ IMSL Libraries. With Warehouse Miner, users can run data mining functions against detailed data in the warehouse without manually coding or moving data between systems.
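
The appeal of this approach is that the statistics run where the data lives. A rough sketch of what calling such an in-database function from an application might look like follows; the connection URL, table and function names are placeholders invented for illustration, not the actual Teradata Statistical Library API:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of "analytics at the source": the statistics run inside the
// database as a user defined function, so only the result comes back.
// The connection URL, table and function names below are placeholders.
public class InDatabaseAnalytics {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:teradata://warehouse-host/database=sales"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             // Hypothetical statistical UDF applied in place to detail data.
             ResultSet rs = stmt.executeQuery(
                 "SELECT region, stddev_udf(order_amount) FROM order_detail GROUP BY region")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + ": " + rs.getDouble(2));
            }
        }
    }
}
```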



Posted June 27, 2008 9:53 PM
Copyright © 2008 James Taylor. Visit the original article at First Look - OutSystems Agile Platform.

OutSystems came to my attention at the Forrester IT Forum, where they were suggested as a tool with good support for what Forrester calls Dynamic Business Applications. Founded in 2001, they have 100+ customers, mostly in Portugal and the Netherlands but increasingly also in the US. Of these they identify 17 existing customers that have a “software factory” built around OutSystems’ Agile method and platform.

OutSystems’ focus is on web business solutions with fast time to market for custom applications that are adaptable, controlled and manageable. They focus on reducing the TCO of custom systems and on change-centric markets. Half their customers are SAP shops and most are building composite applications. They like to differentiate between the stable business elements (well supported by existing ERP, CRM and custom applications) and fast, continuous business change where customers need rapid development and change of a composite application based on the web. Five key areas are supported by the product called the Agile Platform:

  • Integrate - An Integration Studio allows you to make code or legacy systems into services consumable by the rest of the OutSystems platform. It supports introspection for existing services etc.
  • Assemble - Service Studio is a visual IDE for laying out business logic, flow, data structures, calls to services etc
  • Deploy - Service Studio and Service Center are used to define where an application will run, to deploy it and to do QA (error and code checking, generation, compilation and deployment).
  • Manage - Agile Manager and Sizing and Scoping tools allow you to log and monitor deployed applications and their consumed services. In addition, Agile project management capabilities are built into the platform.
  • Change - the system allows you to capture user feedback from the running application (with their Embedded Change Technology) - users click on a hotspot to get a pop-up window where they can say what change is required - and this feedback is captured as a screenshot and then managed as part of the ongoing Agile project management capability.

Within the Assemble piece they support if-then-else, calls to code, iteration etc. There is not much business process support and no support for declarative decision services (except in so far as you could integrate any decision service built with a Business Rules Management System). The environment is designed to be collaborative and is model driven allowing a certain amount of self healing as things are changed.
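
To make that integration path concrete, here is a rough sketch (using Java's standard HTTP client; the endpoint URL and payload shape are hypothetical) of how any platform that can call a web service could consume an externally managed decision service - the application passes data and gets a decision and its reasons back:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch of consuming an externally managed decision service over HTTP.
// The endpoint URL and JSON payload shape are hypothetical.
public class DecisionServiceClient {

    public static void main(String[] args) throws Exception {
        String payload = "{\"customerId\": \"C123\", \"orderTotal\": 250.0}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://rules.example.com/decisions/approve-order"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The decision (approve/refer/decline plus reasons) comes back as data.
        System.out.println(response.body());
    }
}
```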

It’s an interesting development tool - it reminds me somewhat of what CASE tools SHOULD have been when I worked on them back in the late 80s/early 90s. I do think that to be a platform for dynamic business applications they need to provide some tools that allow business users to collaborate more easily around decisions and processes, and I am looking forward to seeing what they do.



Posted June 27, 2008 2:43 PM
Copyright © 2008 James Taylor. Visit the original article at How to address decision making challenges - optimism.

Optimism is one characteristic that it might seem harsh to criticize. But take a look at this article on Accounting for the future. It makes a couple of interesting points. Firstly, that the preparation of projections can be misleading and that an “inside view” - caused by developing a detailed plan, say - makes you more optimistic because you are less focused on outside events. People, it seems, tend toward optimism, and more data and more time to do analysis can make this worse, not better.

One could imagine that this would create problems with staff and customers being overly optimistic about when they might complete something (handling a refund, say), when they might be able to afford something (like paying off a debt), or how likely they are to remember something (like a prescription). Rather than giving them information and hoping they make a good (and not too optimistic) decision, we could automate the decision and do so in a way that uses the data pragmatically. These decisions would then not be optimistic but realistic.
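
A trivial sketch of what using the data pragmatically could mean: base the promised refund date on the historical distribution of actual completion times rather than on an optimistic guess (the figures below are invented):

```java
import java.util.Arrays;

// Sketch: derive a realistic promise from historical completion times
// (in days) rather than an optimistic guess. The data below is invented.
public class RealisticEstimate {

    /** Returns the value below which the given fraction of observations fall. */
    public static double percentile(double[] values, double fraction) {
        double[] sorted = values.clone();
        Arrays.sort(sorted);
        int index = (int) Math.ceil(fraction * sorted.length) - 1;
        return sorted[Math.max(0, Math.min(index, sorted.length - 1))];
    }

    public static void main(String[] args) {
        double[] pastRefundDays = { 2, 3, 3, 4, 5, 5, 6, 8, 9, 14 };
        // Promise the 80th percentile, not the median or the best case.
        System.out.println("Promise refund within " +
                percentile(pastRefundDays, 0.8) + " days");
    }
}
```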

Of course one can be guilty of optimism when designing decision management systems, but I think this would be easier to control as it happens less often and in a more controlled environment.



Posted June 26, 2008 7:12 PM