Blog: James Taylor

James Taylor

I will use this blog to discuss business challenges and how technologies like analytics, optimization and business rules can meet those challenges.

About the author

James is the CEO of Decision Management Solutions and works with clients to automate and improve the decisions underpinning their business. James is the leading expert in decision management and a passionate advocate of decisioning technologies: business rules, predictive analytics and data mining. James helps companies develop smarter and more agile processes and systems and has more than 20 years of experience developing software and solutions for clients. He has led decision management efforts for leading companies in insurance, banking, health management and telecommunications. James is a regular keynote speaker and trainer and he wrote Smart (Enough) Systems (Prentice Hall, 2007) with Neil Raden. James is a faculty member of the International Institute for Analytics.

November 2009 Archives

I hosted a panel last week on predictive analytics at the Business Analytics Summit. I was joined by Richard Boire of the Boire-Filler Group, Jean-Paul Isson of Monster.com and Michael Berry of Data Miners (and author of Data Mining Techniques, one of my favorite data mining books). I asked a series of questions and we got some great answers from the experts:
  • How is your organization using predictive analytics and what has been the business value of doing so?
    • Richard's customers use predictive analytics for, among other things, acquisition models that target not only high-responding prospects but also prospects that will be of high value to the organization once they become customers; retention models that target high-value, high-risk customers; and upsell models that allow organizations to target the customers most likely to become higher-value customers.
    • Monster.com uses predictive analytics for customer intimacy, customer satisfaction, customer retention, customer upsell and wallet-share growth, customer acquisition, pricing, sales coverage optimization and product development.
    • Michael focused on some more general points. He made the basic but important point that predictive analytics focuses on the future because that's the only place you can have an effect, and he also pointed out that the business definition of the target is critical. For instance, predicting which acquisition channel has the highest likelihood of trial subscription sign-ups is potentially much less useful than predicting which channel is most likely to acquire customers who will keep a subscription beyond the trial period (see the sketch after this list).
  • What are the challenges you have faced implementing predictive analytics in your business?
    • Michael emphasized cultural and educational challenges - that this is a new way of doing things and companies often resist things that are not "our way". The inability to find appropriate data in the right format was another big issue.
    • Richard talked about obtaining buy-in and engagement from key stakeholders, the challenges of data and the value of having the right team to effectively implement predictive analytics. The absence of numeracy, of a basic understanding of the power and limitations of the models, was another big challenge.
    • Jean-Paul also emphasized data quality and availability, especially because different countries and systems define things differently. A lack of application systems integration and standardization, and of effective change management across regions, can also be a problem, though the recent recession has helped with the latter by making people more receptive to anything that might help.
  • How did you sell predictive analytics - how do you demonstrate the value of predictive analytics to the various stakeholders within your business?
    • Richard suggested conducting sensitivity and business analysis to demonstrate the monetary potential of a project, as well as identifying stakeholders who are engaged with the data and working with them to prove your case. A project that is a quick win in terms of ROI and implementation also really helps.
    • Jean-Paul emphasized taking baby steps - starting with the basics and always having something meaningful to deliver. Showing the ROI of a model on a small group of customers (a smaller country or region, for instance) also really helped.
    • Michael said to focus on showing how the model will help them do what they do and made the point that he often finds he is the first to look at the data, putting him solidly into discovery mode. Like Richard and Jean-Paul he emphasized the importance of linking everything to real monetary measures.
  • With predictive analytics being such a hot topic, what do you think holds companies back from embracing and exploiting these techniques?
    • Richard felt that a lack of knowledge combined with a discomfort around mathematics and numbers was a big problem. Change management and adopting a new approach also cause problems.
    • Michael emphasized a lack of executive support and the need to get enough support to overcome organizational inertia. He also had a great example where existing measures can make adopting a model hard because the model will drive better overall results while driving a critical measure in the "wrong" direction.
    • Jean-Paul also talked about a lack of understanding of the real value of predictive analytics, and about the attitude of old-school management that they are already successful, so why should they change and spend more money? Bad experiences with IT solutions over the years and the communication skills of those proposing the idea sometimes don't help either.
  • What skill sets are required to achieve success with predictive analytics?
    • Richard emphasized the importance of learning about the business domain, both so that effective models can be developed and so that the models can be related to measures that matter to business executives. Obviously a strong quantitative/mathematical background and an ability to work easily with numbers, as well as good communication and interpersonal skills, were also needed.
    • Jean-Paul said that a wide variety of skills is required, with programmers, statisticians and data miners, business analysts, and web developers all needed to deliver the solution to end users.
    • Michael pointed out that intuition and creativity - an ability to see what's important - are also necessary.
  • We wrapped up with a final question: what does it take to operationalize predictive analytics, to integrate predictive analytics as a regular business discipline? What are the pitfalls?
    • Richard talked about discipline, repeatability, as well as tracking and performance management. A ruthless focus on the business implementation model is also key.
    • Michael reminded us that actionability is critical - if we cannot act and act effectively on a prediction then it does us little good.
    • Jean-Paul said it takes a clear vision, human capital, collaboration, people/process/technology and a focus on the customer/user experience.
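
To make Michael's target-definition point concrete, here is a minimal sketch of the kind of comparison he described, not anything the panelists actually built. The data file and column names (customers.csv, acquisition_channel, region, signed_up_for_trial, paid_after_trial) are hypothetical placeholders; the point is simply that with the same features and the same algorithm, changing the target variable can completely change which acquisition channels look attractive.

```python
# Minimal sketch: same features, same algorithm, two different target definitions.
# All column names and the customers.csv extract are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

customers = pd.read_csv("customers.csv")  # hypothetical extract of acquired prospects
X = pd.get_dummies(customers[["acquisition_channel", "region"]])

targets = {
    "trial sign-up": customers["signed_up_for_trial"],       # easy to predict, less useful
    "retained beyond trial": customers["paid_after_trial"],  # the outcome the business cares about
}

for label, y in targets.items():
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]
    # Rank acquisition channels by the average predicted probability of this outcome
    ranking = (pd.Series(scores, index=X_test.index)
                 .groupby(customers.loc[X_test.index, "acquisition_channel"])
                 .mean()
                 .sort_values(ascending=False))
    print(f"Channel ranking when predicting {label}:")
    print(ranking, "\n")
```

In practice you would also need a long enough observation window to know who actually kept paying after the trial, which is exactly the kind of data availability issue the panel raised.
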
I really enjoyed the panel and I hope I have captured its essence here. If predictive analytics interests you, and it probably should, check out this white paper I wrote on Putting Predictive Analytics to Work and this webinar I recorded with Eric Siegel on Optimizing Business Decisions with Predictive Analytics. Cross-posted to BeyeNetwork and ebizQ, both of whom were media sponsors of the event.

Posted November 16, 2009 7:54 AM

I am hosting a panel on predictive analytics at the Business Analytics Summit and got a chance to attend a session beforehand where Dave Stodder presented on performance management and Key Performance Indicators (KPIs).

Dave began by emphasizing that performance management is both a business and IT issue and that it needs to link people, process and technology. Performance management is focused on how we are doing, why these things are happening and what we should be doing. KPIs are designed to track and measure within a performance management framework. At the end of the day it comes down to Drucker's comment that you cannot manage what you don't measure. As a result, performance management is driven by everything from Balanced Scorecards (Kaplan and Norton) to Six Sigma, TQM and more.

Balanced KPIs keep people focused on what they should be doing, what they can do with their information, as well as providing balance between conflicting goals. They should also be based on multiple measures not just financial ones. Performance Management joins Process Management and Decision Management as a "higher power", one of the levers of improved enterprise performance. Performance management also helps bring BI from departmental usage, focused on reporting, to enterprise-driven metrics and best practices.

Three business imperatives drive demand for BI, Performance Management and KPIs:

  • Establish value - find, increase and retain value
  • Manage risk - identify, predict and protect
  • Rebuild worker productivity - measure, manage and enhance by focusing workers on core issues

Performance management has some overlap with operational BI. The move from traditional BI to more operational BI is usually aimed at improving efficiency and customer service: getting rapid access to accurate information improves both. This pushes more BI functionality out to front-line employees and business processes. Performance management and a focus on metrics can help direct and simplify this by allowing IT to deliver simple metrics rather than complex reports.

At the same time the focus is moving to centrally managed approaches, where departmental systems were more common in the past. The move to operational BI and metric-driven performance management is driving centralization. This delivers consistency, cheaper/faster integration and makes it easier to implement data mining and predictive analytics.

Improving customer service, focusing on customer metrics, is the most important objective for 60% of folks surveyed by Ventana Research. This focus on customers means more data sources (because customer data is scattered) and more real-time data (because customers keep doing things). Interestingly, customer contact centers are emblematic of the challenges with KPIs. For instance, though the agents are measured and managed very tightly, their supervisors are not. Supervisors don't feel they can impact customer service directly, so they don't see the metrics as relevant to them. KPIs must be designed to match what the person can impact (and, I would add, give them the ability to change the systems that affect the metric when that is necessary).

Accountability is critical for KPIs. Is it reasonable to hold someone accountable to a particular metric? Are the people who come up with the metrics held accountable for their implementation? Who needs to see what - are you keeping people focused on metrics that fall into their area of responsibility?

Proliferation is a second issue. Just like reports, too many KPIs can be distracting rather than useful. Vague implementation without real accountability and control will result in metrics that are just noise.
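
One lightweight way to act on the accountability and proliferation points is to treat the KPI definitions themselves as data, so that every metric names an accountable owner and the lever that owner actually controls, and it is easy to spot owners who have accumulated too many metrics. The sketch below is my own illustration under those assumptions, not something Dave presented; the roles, levers, targets and the cap of five KPIs per owner are hypothetical.

```python
# Minimal sketch: KPI definitions as data, with an accountable owner, the lever
# that owner controls, and a simple proliferation check. All values are hypothetical.
from dataclasses import dataclass
from collections import Counter

@dataclass
class KPI:
    name: str
    owner: str    # who is accountable for the metric
    lever: str    # what the owner can actually do to move it
    target: float # hypothetical target value

kpis = [
    KPI("average_handle_time", owner="contact_center_agent", lever="call handling", target=300.0),
    KPI("first_call_resolution", owner="contact_center_agent", lever="call handling", target=0.8),
    KPI("team_csat_score", owner="contact_center_supervisor", lever="coaching and staffing", target=4.2),
]

# Proliferation check: too many KPIs per owner is noise, not management.
MAX_KPIS_PER_OWNER = 5
for owner, count in Counter(k.owner for k in kpis).items():
    if count > MAX_KPIS_PER_OWNER:
        print(f"Warning: {owner} is accountable for {count} KPIs - consider pruning")
```

Even something this simple makes the "who can actually move this number?" conversation explicit.
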

A central focus on well-defined metrics can also really help after mergers and acquisitions - it can be easier to develop integrated metrics than integrated reporting. A focus on metrics also tends to deliver a top-down view rather than the bottom-up view typical of reporting, and to improve alignment.

Human psychology is critical in performance management. Will people focus only on achieving the goal or will they be more thoughtful about what is really needed? Will people understand why those metrics matter? Do people want their performance to be transparent and at what level?

Keep the number of KPIs reasonable, make sure people understand what is driving the KPIs and think continuous change.

Cross-posted to ebizQ and BeyeNetwork as both were media sponsors of the Business Analytics Summit.


Posted November 12, 2009 7:04 PM
I participated in BeyeNetwork's Radio Show this morning and we had some interesting discussions around analytics and in-memory analytics. Check out the recording of our discussion on business analytics, in-memory databases and the BI maturity curve: http://www.b-eye-network.com/listen/12070

Posted November 10, 2009 3:37 PM