
Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is the leading expert in decision management and a passionate advocate of decisioning technologies – business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

June 2009 Archives

Timo Elliott had an interesting post, Gartner on Collaborative Decision Making, in which he discussed a Gartner report called The Rise of Collaborative Decision Making (thanks to Nic Smith of Microsoft for the link). This kind of ad-hoc, collaborative decision making is critical in companies, and technology to support it is thin on the ground. In fact this was the topic of an Andrew McAfee post, The Diminishment of Don Draper, in which he made the point that unsupported, gut, expert or oracular decisions have some serious limitations:

  • Opaque - you can't explain them
  • Not amendable - they are take-it-or-leave-it propositions
  • Not disconfirmable - there's no explanation of how they are made that can be analyzed
  • Not revisited - because there is no way to "edit" them

This prompts two thoughts. The first is that the kind of technology David Ullman has been working on (Accord, reviewed here) is worth considering for this kind of collaborative decision making. A decision supported by Accord would be transparent - you could see why you decided the way you did - as well as editable over time so that new data, or new options, could be integrated and evaluated.

The second is that these same characteristics are true of decisions made by your front line staff when they interact with your customers. They often can't explain them, not that anyone really asks, and they tend not to be amendable because the customer moves on afterward, happy or not. There is no way to analyze the thought process of these staff and so no way to revisit their decisions to devise a better approach in the future. And these issues are more serious because we are not talking about the executive team (complete with lots of experience, assistants and analysts, deep business understanding etc) but about your least experienced, lowest paid staff.

In this second scenario one effective approach is to use Decision Management to put the decision making into your systems - into Decision Services that support your systems, to be precise. Embedding the policies, regulations and best practices that you want applied as rules, and using the data you have to drive analytic models with simple outputs (scores, for instance), gives better decisions to the front line while ensuring transparency and an ability to analyze your decisions and learn what works so that you can constantly improve.
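To make this concrete, here is a minimal sketch of what a Decision Service of this kind looks like. The rules, thresholds and field names are all invented for illustration - a real service would call a trained scoring model and a rules engine rather than hard-coded functions:

```python
# Hypothetical sketch of a Decision Service: policy rules plus an
# analytic score combine to return a simple answer to the caller.

def risk_score(applicant):
    # Stand-in for a predictive model; a real system would call a
    # scoring engine built from historical data.
    score = 600
    if applicant["years_as_customer"] > 5:
        score += 80
    score -= 120 * applicant["missed_payments"]
    return score

def credit_decision(applicant):
    # Policy and regulation expressed as explicit, auditable rules.
    if applicant["age"] < 18:
        return "decline"            # regulatory rule fires first
    score = risk_score(applicant)
    if score >= 650:
        return "approve"
    if score >= 550:
        return "refer"              # route to a human reviewer
    return "decline"

print(credit_decision({"age": 34, "years_as_customer": 7,
                       "missed_payments": 0}))  # approve
```

The point is that the front-line system (or its user) just gets "approve", "refer" or "decline" - while every rule and score that produced the answer remains visible and analyzable.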

So figure out how to help your executives and managers collaborate around decisions effectively and use Decision Management to ensure you know what's going on at the front line.

Posted June 25, 2009 5:00 PM
Permalink | 1 Comment |

The International Business Rules Forum™ is the premier Conference dedicated to Business Rules, where the future of Business Rules, Decisioning, Compliance & Enterprise Design is taking shape! This year's event covers Business Rules, Decision Management, Business Process, Governance and Compliance. Once again I will be giving a tutorial and a keynote and acting as track chair for the great decisioning track, so register NOW!

The 12th International Business Rules Forum is November 1-5, 2009 at the Bellagio, Las Vegas and this year's theme is All Around Decisioning. The 2009 Conference Program is now available and you can register for the Super Early Bird by June 30! Take advantage of the Super Early Bird and get 5 days for the price of 3 or the 3-Day Conference for only $1,295. More details over on my other blog.

Posted June 11, 2009 5:43 PM
Permalink | No Comments |
No sooner had I decided to cross-post one product review from my main blog than a second one seemed worth doing. This time it is my review of Lyza.

I got a chance to see Lyzasoft's new product in action recently. Lyzasoft aims to provide a desktop product for business people to do analysis that can seamlessly scale up, unlike (say) spreadsheet based analysis. The product is based around a column store.

Workbooks are the core metaphor and these are used to assemble flows. Data connections are the first step in these flows and can be created from Access, text files, Oracle databases, etc. Users can drag and drop various elements - a stack of linked queries, perhaps. Data is then sucked into the column store. A nice drag-and-drop interface allows joins, appends and more to be added. Each node in the workbook flow consists of inputs, instructions and outputs, and it is easy for users to chain these together. For each node the user sees input data at the top for the sources being manipulated. Simple operations and drag and drop can then be used to take action. For instance, similar columns can be dragged so that the tool knows that they can be stacked. Users can also set default values, define formatting and more as they work on the data. It is easy to add filters and other transformations, and Excel-like formula building in column definitions allows things like "previous purchase" to be defined as a column. Nodes include summarization (non-destructive), filtering (destructive), calculations (additive), joins (could do anything), sourcing decisions and more.
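A minimal sketch of this node-chaining idea, with plain Python functions standing in for Lyza's visual nodes (the data and column names are invented, not from the product):

```python
# Each "node" takes rows in and returns rows out, so nodes chain
# into a flow the way workbook steps do.

orders = [
    {"customer": "A", "amount": 120},
    {"customer": "B", "amount": 80},
    {"customer": "A", "amount": 40},
]
customers = [{"customer": "A", "region": "EU"},
             {"customer": "B", "region": "US"}]

def join_node(left, right, key):
    # Inner join on a shared key, like the visual join node.
    lookup = {r[key]: r for r in right}
    return [{**l, **lookup[l[key]]} for l in left if l[key] in lookup]

def filter_node(rows, predicate):
    # Destructive: rows that fail the test are dropped.
    return [r for r in rows if predicate(r)]

def summarize_node(rows, key, field):
    # Non-destructive summarization: group and total.
    totals = {}
    for r in rows:
        totals[r[key]] = totals.get(r[key], 0) + r[field]
    return totals

flow = summarize_node(
    filter_node(join_node(orders, customers, "customer"),
                lambda r: r["amount"] >= 50),
    "region", "amount")
print(flow)  # {'EU': 120, 'US': 80}
```

Because each step is a separate, inspectable node, the output of any step can be examined and the chain re-run - which is exactly what makes the visual feedback and traceability described below possible.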

The tool is designed to handle large data sets and flag issues (like missing data) automatically. It takes seconds to import millions of rows and it is very quick to display results, filter down by values, summarize and so on. Everything is designed to make it possible for non-techies to work effectively. The join node, for instance, has nice visual clues using a Venn diagram and handles conversions of data elements so the join can be defined. The speed allows constant visual feedback, so users can see the results of an action, decide if that is what they wanted or expected, and either undo or continue. They do not have to worry about the technicalities - is this an inner or outer join, for instance?

Users can build nice graphs and generate trace documents from an XML specification of the environment. Everything is traceable and visible. If a user wants to build on someone else's work and has access to their analysis, they can see the trace all the way back. This means any shared analysis is understandable, and the traceability is one of the product's best features. In addition this XML-based information specification can be moved to a server-based environment. This allows companies to bridge ad-hoc, end-user-centric analysis to IT. No re-do, no spreadsheet brittleness - very nice, as this allows people to answer the question "What's in that number?" The derivation of summary information is key and is made visible by the product. The tool also allows "re-parenting" so that a temporary source (say a file dumped out of a database) can be replaced (say with a live connection to the data). This is a powerful feature for creating the seamless promotion from end-user to centrally managed.

The enterprise version has a web services API for the flow and access to enterprise databases; a light version omits enterprise connectivity and APIs. In addition there is a commons version for brokered peer-to-peer sharing of analysis. Servers can allow analysts to create pub/sub relationships with each other to share analysis and these can be monitored. The intent is to make it possible to manage analysts, replace people who quit, update schedules and so on. Cut and paste is replaced with links through a shared commons. They are also adding a web front end so that non-Lyza users can consume and comment on reports and analysis.

They are adding some stats and analytics (stepwise regression, for example) but there is clearly more to come.

I really liked both the ease of use and the way in which end users are brought into the tool without being condemned to a marginal existence - the same analysis can be created locally and then shared effectively as part of a real IT system. The traceability and the declarative nature of the tool were both great.

Posted June 10, 2009 7:01 PM
Permalink | No Comments |
I regularly post "first looks" at new products on my personal blog and my review of the new release of myDials seemed worth cross-posting to BeyeNetwork.

myDials is a company focused on optimizing operational performance by delivering timely, relevant, actionable performance metrics, contextual information, guidance and "every person" analytics. On June 8th they announced myDials 3.0. myDials is focused on helping all employees engage in the monitor/analyze/adjust cycle that helps ensure that operations remain in synch with company strategies and plans. They feel that all employees must participate in this process, and this is tricky because you need to present financial and operational data intuitively. Most companies end up with lots of spreadsheets and inconsistency, as well as after-the-fact analysis. In addition, Key Performance Indicators or KPIs need to be turned into Key Performance Drivers, and these need to be compared with targets to find variance. This variance needs to be interpreted and some action taken. myDials functionality includes:

  • Monitoring of KPDs/Targets using dashboards - they have added time context and personalization in 3.0 as well as a metric library
  • Alerting using a rule engine for visual and email alerts
  • Knowledge sharing through embedded information annotations
  • Analyzing - they have expanded drivers, trends, forecasts, control charts and Pareto analysis in 3.0
  • Acting - they have added what-if scenarios in 3.0

myDials is a hosted on-demand application. The basic view is a dashboard with various tabs. A nice looking collection of gauges and graphs - they call all these Dials - is available and these are collected into ribbons that can be collapsed and expanded. Navigation controls are hidden until you mouse over a dial, to keep the interface clean. Dials can show information that describes them, with web links to supporting information (inside or outside the firewall) as well as visual alerts - a nice feature.

Dials can be expanded and users can drill into the relevant dimensions. The expansion mechanism is nicely implemented, keeping a train of context - a drill tree if you like. Notes can be added to data points and these will be seen whenever the data is used in a dial - a nice way to share information about what is going on. The context for the whole dashboard can be changed (from worldwide to Europe, for instance) and a set of breadcrumbs is displayed to show where and when you are in the data. Sliders and other controls can be used to move around the time period or to see quarterly or annual roll-up or weekly/daily drill-down for instance.

For each dial a set of drivers can be defined. When a dial is in an alert condition you see which drivers are out of range and can drill in to see what contributed to the out-of-range condition. These drivers can be defined with complex expressions, and different zones (a Critical Zone, for instance) can be specified using a formula. Alerts can be specified in terms of conditions using the expressions (including calculations across multiple metrics and periods) and notification rules can be specified similarly.
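The zone-and-alert mechanics could be sketched like this - the thresholds, driver names and targets are made up for illustration, not taken from myDials:

```python
# Hypothetical sketch: a driver value is classified into a zone by
# formula, and a notification rule reports every driver whose zone
# is not normal.

def zone(value, target):
    # Zones defined relative to a target, as a zone formula might be:
    # within 5% is normal, within 15% is warning, beyond is critical.
    variance = abs(value - target) / target
    if variance <= 0.05:
        return "normal"
    if variance <= 0.15:
        return "warning"
    return "critical"

def alerts(readings, targets):
    # Notification rule: list each out-of-range driver and its zone.
    return {name: zone(value, targets[name])
            for name, value in readings.items()
            if zone(value, targets[name]) != "normal"}

targets = {"throughput": 1000, "downtime_hours": 10}
readings = {"throughput": 870, "downtime_hours": 10.2}
print(alerts(readings, targets))  # {'throughput': 'warning'}
```

Here throughput is 13% below target (warning zone) while downtime is within 2% of target, so only throughput triggers an alert - the same condition that would light up the dial and drive a notification rule.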

Users can enter values directly to create a "what-if" scenario too. The dial then responds to the what-if data and shows what the impact would be (all dials impacted by the what-if analysis show a what-if symbol). Multiple points in a dial can have what-if data added to simulate trends. Dials can also automatically display analysis lines across the dashboard. Trend lines, appropriate to the kind of data, are displayed. The kind of trend analysis desired is specified as part of the configuration and then on-the-fly analysis is performed on the data in the dial to display the trends.

myDials is focused particularly on manufacturing, energy and mining (and has some features that support these industries nicely) but is seeing growth in government and healthcare / hospital management.

Like some other dashboard tools I have seen recently, the folks at myDials are keen to ensure that users can take action in response to what they see. I also appreciated the efforts to bring predictive analytics/trending to bear in a way that would make sense to a user and their focus on operational decisions. Because these trends can be used in the alerts users should be able to define alerts that will be triggered when something is about to go wrong, not just after the fact. As always I would like to see more ability to automate the actions being taken as a result of changing data but overall myDials is a nice looking product with a good attitude.

Posted June 8, 2009 9:25 AM
Permalink | No Comments |
Last week I posted Use decision management to make systems smarter and I got two interesting comments. Ronald made an excellent point first:
I don't think the lack of deep analytic tools is the prime reason for BI not being intelligent. In my opinion it's the lack of education and skills concerning analytical methods and thinking in the individual as well as the lack of 'an analytic culture in the organization'- which Davenport also writes about.
This is, of course, true. People tend to look at data in a fairly shallow way and all too often make decisions based on their gut rather than on rigorous analysis of data anyway (as discussed in this post To Hell with Business Intelligence, try Decision Management). George followed up on this by saying:
Going deep to analyze the data available to the organization is great, but if the user getting the output of that is not trained or capable of understanding how to use that information, it is all a waste of effort.
Now while I agree with the two of them, I would say one thing about decision management - the users of a decision management system don't need to know how to use the information or to have analytic skills. Indeed that's part of the point - the analysis of the information, and the insight that can be derived from it, are embedded into the system and the user simply gets an answer (perhaps a yes/no answer, perhaps a price, perhaps a range of options from which to choose). You still need analytic skills; you just don't need them in the application's users.

IBM used a neat phrase when they launched their recent business analytics service line - from decision support to action support.

Posted June 8, 2009 7:56 AM
Permalink | No Comments |