
Five Things Leaders and Practitioners Are Not Doing in Predictive Analytics (but really need to…)

Originally published October 15, 2012

The widespread adoption of predictive analytics has been at the mercy of two opposing forces over the past two decades. Frequent, compelling case reports from the few organizations that have properly implemented predictive analytics projects have propelled the discipline into the mainstream. Yet its perceived complexity has slowed adoption.

Let's take a look at five critical things business intelligence (BI) and analytics professionals often overlook, thereby depriving their organizations of the substantial benefits of predictive analytics.

#1 Getting Started the Right Way.

Predictive analytics is not another flash-in-the-pan technology. Most Fortune 500 companies have established departments and practices that refine crude data into high-octane intelligence. But few beyond the largest corporations have formalized or even organized their approaches.

Why haven't more jumped into the game? There are many barriers and excuses, most notably:
  • Corporate executives don't believe predictive analytics can deliver net gains;

  • Department heads who would have a stake in the project assume it will require a substantial investment in expensive consultants or a bench of professionals with doctorates;

  • BI managers don't want to take on another initiative; and

  • The business can't get ahead of the deluge of data.

Organizations are simply not getting started with predictive analytics the right way. Those that approach it like any other IT or BI project will not unearth meaningful or measurable results, and predictive analytics will be prematurely dismissed.

Yet as the flow of incoming data gains speed, organizations can't just keep pumping dollars into storing, structuring, cleansing, transforming and transporting data. They need to ultimately uncover the core value: actionable information hidden within their growing data stores.

#2 Training Before Doing.

Why not approach predictive analytics like most IT and BI projects, with an emphasis on technical training? Most IT projects focus on fulfilling objectives at the operational level; their functions are more direct and tactical. Predictive and advanced analytics, however, rely heavily on strategic assessment, design and implementation. For practitioners and functional managers entrenched in typical IT projects, this requires a significant shift in mind-set.

Most are surprised to learn that the bulk of the training necessary for predictive analytics success is not technical. The mathematics involved can be very sophisticated, but modern software tools automate their complexity, allowing most business practitioners to build adequate predictive models with little training. Although itís helpful to have some statistical grounding and a basic appreciation of the capabilities and limitations of various modeling methods, expertise in actual model development has little impact on the success or failure of a project.

But it is imperative that at least one project manager or functional leader be well-versed in a formal, methodical, process-driven approach to predictive analytics at the project level, such as the Cross-Industry Standard Process for Data Mining (CRISP-DM). Unfortunately, there are very few courses out there that provide this emphasis, particularly from a vendor-neutral perspective. But a search on data mining and predictive analytics strategic training will produce a few good options with a project-level orientation.
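To make the point concrete, here is an illustrative sketch (mine, not from any CRISP-DM tooling) of the six published CRISP-DM phases treated as an ordered checklist a project lead might walk through before anyone touches modeling software:

```python
# The six CRISP-DM phases, in order. The checklist helper is a
# hypothetical illustration of a process-driven, project-level view.
CRISP_DM_PHASES = [
    "Business Understanding",  # define objectives and success criteria first
    "Data Understanding",      # collect, describe and explore the data
    "Data Preparation",        # clean, select and transform inputs
    "Modeling",                # build and tune candidate models
    "Evaluation",              # judge models against *business* objectives
    "Deployment",              # put results into operational use
]

def next_phase(completed):
    """Return the first CRISP-DM phase not yet completed, or None if done."""
    for phase in CRISP_DM_PHASES:
        if phase not in completed:
            return phase
    return None

print(next_phase({"Business Understanding"}))  # -> Data Understanding
```

Note that Modeling is the fourth phase, not the first; the half of the process that precedes it is exactly the strategic work that technical training alone does not cover.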

#3 Designing Before Building.

If you were building a new house, wouldn't you first meet with designers and engineers to draw up blueprints? What would your home look like if the builder started construction before fully understanding your needs, preferences, and site dynamics? The answer is obvious.

Unfortunately, most organizations start predictive analytics by jumping directly in with software and data and then hammer away on models without understanding what they're building. When predictive analytics is not approached as a well-planned process, practitioners may still end up building very good models, but the models answer the wrong questions and can't be interpreted or implemented.

A number of standardized processes spell out precisely how to plan and implement a successful predictive analytics project. Two of the major ones are the vendor-neutral CRISP-DM and the SAS-specific SEMMA. But vendors are primarily interested in trumpeting and selling technology. Their marketing efforts have led misguided organizations to listen to the loudest voices and start with the easiest-to-obtain resource: software.

Even the few companies that are aware of standardized processes resist starting with training and following a formal process like CRISP-DM. When it comes to predictive analytics, nobody wants to have to sell the notion of a comprehensive assessment and resulting project definition. The exercise itself is not sexy. It requires an up-front investment of time, money and effort. And it will not produce a return on investment. But then again, neither will blueprints. They will help ensure that your house doesn't collapse, though.

#4 Putting the Problem Ahead of the Analytics.

Many argue that organizations should start with predictive analytics to uncover unknown insights, relationships or anomalies that may direct subsequent analysis. This approach may uncover a few artifacts of interest, but it rarely moves beyond an unsupervised exploratory exercise.

A 2009 survey of self-proclaimed data-mining practitioners by Rexer Analytics revealed a focus on the wrong part of the problem. The majority of participants cited the performance or accuracy of predictive models as the most important factor in determining success. In the real world, however, practitioners are not rewarded for how well a model conforms to artificial metrics, but rather for how effectively it helps optimize the allocation and use of organizational resources.
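A tiny, entirely hypothetical example shows why raw accuracy can mislead. All the numbers below are invented; the point is that once gains and costs are attached to predictions, the "less accurate" model can be worth more to the business:

```python
# Confusion-matrix counts: tp/fp/tn/fn. Payoff figures are assumptions
# for illustration (e.g., a direct-mail campaign where each true positive
# yields revenue and each false positive wastes one mailing).
def accuracy(tp, fp, tn, fn):
    return (tp + tn) / (tp + fp + tn + fn)

def expected_profit(tp, fp, tn, fn, gain_per_tp=100, cost_per_fp=5):
    return tp * gain_per_tp - fp * cost_per_fp

# Model A: conservative -- high accuracy, but flags few buyers.
a = dict(tp=20, fp=10, tn=940, fn=30)
# Model B: aggressive -- lower accuracy, but catches far more buyers.
b = dict(tp=45, fp=120, tn=830, fn=5)

print(accuracy(**a), expected_profit(**a))  # 0.96 accuracy, 1950 profit
print(accuracy(**b), expected_profit(**b))  # 0.875 accuracy, 3900 profit
```

Model B loses on the "artificial metric" and wins on resource allocation, which is precisely the distinction the survey respondents missed.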

While discovery is a function of predictive analytics, the derived information should support the organization's priorities, and not the other way around. The first step should be establishing and validating strategic priorities across functional teams. Justifying the time to properly design an analytics project in these days of Agile BI and immediate results is not easy, but it's mandatory. A comprehensive assessment, formal project definition, and measurement framework must be established for results that are substantive and sustainable.

#5 Avoiding Distractions and Buzz.

Just as the masses moved far enough down the BI chain to embrace predictive analytics as a formal practice and start deriving value, they were diverted to the next shiny, new thing upon hearing exciting terms like big data and data science.

Seasoned BI professionals will recall how chasing hype can lead to costly exploration of uncharted, underdeveloped and oversold technology.

Vendors riding the coattails of the big data analytics buzz may argue that traditional analytics just won't get the job done in the new world of ever-swelling data sets. When real-time analysis of high-velocity, multidimensional data is required, there may be a need for computational scalability. But applications that call for in-database analytics or distributed processing are certainly not the norm.

When sampled properly, only a small fraction of the data available is required to adequately represent the solution space (a mathematical term describing the entire area represented by multiple dimensions) and effectively train a predictive model to produce the desired information. Once such models are deployed, running them and scoring large volumes of new data can be done highly efficiently, suiting the vast majority of business applications.
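The sampling claim is easy to demonstrate on synthetic data. The sketch below (my own toy setup, not from the article) generates a noisy one-feature classification problem, then fits the same simple threshold "model" on the full data set and on a 10% random sample; both land near the accuracy ceiling set by the label noise:

```python
import random

random.seed(42)

def make_row():
    # One feature x in [0, 1); true label is 1 when x > 0.5,
    # with 10% label noise, so ~0.90 is the best reachable accuracy.
    x = random.random()
    y = int(x > 0.5)
    if random.random() < 0.1:
        y = 1 - y
    return x, y

full = [make_row() for _ in range(20_000)]
sample = random.sample(full, 2_000)  # a 10% random sample

def fit_threshold(rows):
    # "Model": the cut point on x that maximizes training accuracy.
    best_t, best_acc = 0.0, 0.0
    for t in (i / 100 for i in range(101)):
        acc = sum((x > t) == bool(y) for x, y in rows) / len(rows)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

test_set = [make_row() for _ in range(10_000)]

def score(t):
    return sum((x > t) == bool(y) for x, y in test_set) / len(test_set)

acc_full = score(fit_threshold(full))
acc_sample = score(fit_threshold(sample))
print(acc_full, acc_sample)  # both near the 0.9 noise ceiling
```

The sample-trained model performs essentially as well as the full-data model, which is why training on a representative fraction, then scoring new data in volume, suits most business applications.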

Be wary of the hype. Those who maintain the course toward predictive analytics while gradually building other aspects of their BI chains toward big data scalability are due for early payback.

The Takeaway

So don't cheat the process. Don't rush to grab software and dive headlong into your data. It's simply not worth it, and that has been proven time and time again. If you don't succeed with predictive analytics on a small-scale pilot program driven by a sound assessment and project definition, then forging ahead into big data analytics will produce nothing more than a big noise.

  • Eric King

    After graduating from the University of Pittsburgh with a bachelor's degree in computer science in 1990, Eric joined NeuralWare, Incorporated, a neural network tools company, as a senior account executive. In 1994, he moved to American Heuristics Corporation (AHC), an advanced software technology consulting company specializing in artificial intelligence applications. At AHC, he served as the director of business development for the commercial services division and the training division, The Gordian Institute. With the valued support of AHC, contractors, customers and family, Eric founded The Modeling Agency in the spirit of establishing long-term professional relationships. The Modeling Agency's focus is providing guidance and results for those who are data-rich, yet information-poor. Eric may be contacted by e-mail at eric@the-modeling-agency.com.

    Editor's Note: More articles and resources are available in Eric's BeyeNETWORK Expert Channel on data mining and predictive analytics. Be sure to visit today!
