Prescriptive Analytics: Making Better Decisions with Simulation

Originally published January 21, 2014

Continuing our journey into the world of prescriptive analytics, we are going to look at Monte Carlo simulation. For many years Monte Carlo simulation has been strategically applied in financial services, the military, healthcare, utilities, transportation, research and a wide array of other industries to aid complex decision making where many factors have inherent uncertainty. Whether the uncertainty comes from variable supplier costs, unknown market demand, competitor pricing changes or other unknown factors that influence an outcome, Monte Carlo simulation can be used to understand the full range of possible scenarios for planning, making decisions and mitigating risk. The accuracy and insight gained from this technique far surpass classic what-if scenario analysis. Monte Carlo simulation arms decision makers with objective probabilities for all possible outcomes, empowering informed decision making in uncertain conditions.

What is Monte Carlo simulation? It is a stochastic analysis technique that combines value ranges for non-deterministic or unknown variables with probability theory to create thousands of “what-if” scenarios. A computer-automated simulation model repeatedly substitutes random values for each uncertain variable, drawn from the statistical probability distribution the analyst expects that variable to follow. It is these probability distributions, evaluated across all possible combinations of variable values, that allow for a realistic assessment of scenario uncertainty.
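The core loop is simple enough to sketch in a few lines. The model below is a hypothetical profit calculation, not one from the article: the variable names, distributions and parameter values (supplier cost, selling price, demand) are illustrative assumptions chosen only to show the mechanics of drawing from analyst-assigned distributions and tallying outcome probabilities.

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

N = 10_000  # number of simulated "what-if" scenarios

profits = []
for _ in range(N):
    # Analyst-assigned probability distributions for the uncertain inputs
    # (all names and parameters here are illustrative assumptions):
    unit_cost = random.triangular(8.0, 12.0, 9.5)   # supplier cost: worst, best, most likely
    price = random.uniform(14.0, 18.0)              # competitor-driven selling price
    demand = random.normalvariate(50_000, 8_000)    # market demand, roughly normal
    profits.append((price - unit_cost) * demand)

# The result is not one answer but a distribution of outcomes,
# from which objective probabilities can be read off directly.
print(f"mean profit:      {statistics.mean(profits):,.0f}")
print(f"5th percentile:   {sorted(profits)[int(0.05 * N)]:,.0f}")
print(f"P(loss) estimate: {sum(p < 0 for p in profits) / N:.1%}")
```

Unlike a classic what-if analysis that evaluates a handful of hand-picked scenarios, every one of the 10,000 trials here is a complete scenario, so the summary statistics carry probability weights rather than best guesses.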



Figure 1: Oracle Crystal Ball Monte Carlo Model Variable Probability Distributions

Successful Monte Carlo simulation depends on analysts assigning accurate worst-case and best-case values. Although historical data is often used as a baseline, extreme conditions may not be present in that data, resulting in a flawed model. For example, following the stock market crash of October 2008, quite a few businesses went bankrupt or had to reinvent their operations because their business models had never considered a deep recession when setting worst-case variable values. According to Douglas Hubbard, author of How to Measure Anything: Finding the Value of Intangibles in Business, the frequency of rare catastrophic events is much higher than most models assume. “If I fill a bucket with dice and roll them, this activity will yield what we know as a normal distribution. But most of the risks we worry about modeling for in the financial world do not behave this way. Financial markets behave more like earthquakes, forest fires, and tsunamis. Their interrelated components mean that the whole system can be stressed. The failure of one component causes the failure of many other things. The single biggest risk for any organization—or nation—is the lack of validating risk analysis methods. Precautions or analysis of financial volatility are useless if inadequately assessing risk to begin with.”
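Hubbard's dice-versus-earthquakes point can be made concrete with a small experiment. The sketch below compares the extreme tail of a normal distribution with that of a heavy-tailed Pareto distribution; the choice of Pareto and its shape parameter are illustrative assumptions, not anything Hubbard specifies, but they show how much the tail assumption alone changes a worst-case estimate.

```python
import random

random.seed(0)
N = 100_000

# Normal model: like a bucket of dice, extreme outcomes are vanishingly rare.
normal_draws = [random.normalvariate(0, 1) for _ in range(N)]

# Heavy-tailed model (Pareto, alpha = 1.5): "earthquake-like" behavior where
# rare events are far larger; the distribution and alpha are illustrative choices.
pareto_draws = [random.paretovariate(1.5) for _ in range(N)]

# Compare the worst 0.1% of outcomes under each assumption.
normal_extreme = sorted(normal_draws)[-N // 1000]
pareto_extreme = sorted(pareto_draws)[-N // 1000]
print(f"normal 99.9th percentile: {normal_extreme:.1f}")  # a few standard deviations
print(f"pareto 99.9th percentile: {pareto_extreme:.1f}")  # dramatically larger
```

A model calibrated on the normal assumption would call the Pareto-sized event effectively impossible, which is exactly the flaw that caught businesses off guard in 2008.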

Simulation over the Last 20 Years

In the real world, Monte Carlo methods are often applied to assess business investments and multifaceted projects, and even to gauge scheduling risk on large projects. I first came across this analytic technique when supporting an engineering manager at a medical device company back in the early ‘90s. Engineering management was responsible for medical device hardware purchasing, software development, pilot device builds, testing, compliance and getting a unit into production. Those dependent processes involved many variables, significant risk of engineering design failure and high-stakes investment. In many cases, it would not have been feasible to experiment with engineering designs at all. To aid C-level executives in making the best “big bet” future decisions, Monte Carlo simulations were used to communicate best-case and worst-case product development scenarios, potential schedule change impacts and financial gain/loss considerations.

Today Monte Carlo simulations are still used in the same manner. As organizations get more sophisticated with analytics and society continues to embrace the age of big data, I expect to see this technique used more often than it has been in the past. Many MBA programs now teach Monte Carlo techniques within their fundamental curricula. What has changed is the array of modern tools available for performing Monte Carlo simulations and the exponential increases in computing performance that allow much faster simulation runs. A few popular software packages used for Monte Carlo simulations include @RISK, Oracle Crystal Ball, Frontline Systems Solver Platform for Excel and TIBCO Spotfire Enterprise Runtime for R. You could also build your own solution with open source R.

Sensitivity Analysis

One of the most useful, actionable aspects of Monte Carlo simulation is sensitivity analysis, which gives deeper insight into the specific uncertain variables that have the most influence on outcomes. Sensitivity analysis provides a ranked shortlist of the variables with significant impact. The knowledge gained from sensitivity analysis alone helps decision makers target the key influential variables in their decisions instead of wasting time and money on minimally relevant areas.



Figure 2: Oracle Crystal Ball Sensitivity Analysis

The Bigger Picture

Computer-based Monte Carlo simulation allows analysts to test and improve decision making quickly, easily and cost effectively, without the consequences of real-world experimentation. Monte Carlo models can be validated against historical data when it is available, but usually these models are used in forward-looking, unknown situations. By using a prescriptive analytics process and techniques like Monte Carlo simulation, you not only get insight into what could happen in the future, but also invaluable insight into which actions truly make a difference and mitigate risk. If you analyze data for a living, Monte Carlo simulation should be in your repertoire of analytical skills.

  • Jen Underwood
    Jen Underwood has almost 20 years of hands-on experience in the data warehousing, business intelligence, reporting and predictive analytics industry. Prior to starting Impact Analytix, LLC, she held roles as a Microsoft global business intelligence technical product manager, Microsoft enterprise data platform specialist, Tableau technology evangelist and also as a business intelligence consultant for Big 4 systems integration firms. Throughout most of her career she has been researching, designing and implementing analytic solutions across a variety of open source, niche and enterprise vendor landscapes including Microsoft, Oracle, IBM and SAP.

    As a seasoned industry presenter, author, blogger and trainer, Jen often volunteers her time and gives back to the global technical community in many ways. Recently Jen was honored with a Boulder BI Brain Trust (BBBT) membership, a 2013 Tableau Zen Master (MVP) award and recognition as a Dun & Bradstreet MVP. Jen holds a bachelor of business administration degree from the University of Wisconsin, Milwaukee and a postgraduate certificate in computer science -- data mining from the University of California, San Diego.

    She may be contacted by email at jen@impactanalytix.com, and her blog can be found here.

    Editor's Note: Find more articles and resources in Jen's BeyeNETWORK Expert Channel. Be sure to visit today!
