Ron Powell, independent analyst and industry expert for the BeyeNETWORK and executive producer of The World Transformed FastForward Series, interviews Dr. Barry Devlin, founder and principal of 9sight Consulting, to discuss one of the essential components of a production analytic platform: a process for decision making, which has been given too little attention by enterprises when it comes to analytics.
Barry, self-service business intelligence (BI) and data discovery loom large in today’s analytic thinking and marketing, yet I hear you talk a lot about the need for a decision-making process. Are you bucking that trend?
Barry Devlin: You know, Ron, I’m a consultant, so the answer is “it depends.” The longer answer is that business really does need agility: the ability to ask any question and run any analysis. In that sense, self-service BI and data discovery are very important today. Where I differ a little is that I feel there need to be some safety rails for businesspeople doing these types of analyses, to make sure they are using the right data, that the data is of sufficient quality and that it comes from good sources. Blending those two requirements, freeform analytics on the one hand and managing data quality and getting the right data on the other, requires us to think about decision making as a process. I call that process a decision cycle so that we understand it is a cycle, both in the heads of the users and within the organization or system that supports it.
Can you describe this decision cycle?
Barry Devlin: When I talk about the decision cycle, I use an acronym that I made up quite some time ago. It is MEDA, and that stands for monitor, evaluate, decide and act. This is a closed-loop sense-and-respond approach. I’ll take you through those four steps so you can understand how they work.
The first step is monitor, and that is monitoring whatever is happening inside and outside the enterprise. Of course, in the past that was always about running the operational systems and then taking the data out of them into a BI system and so on. But today monitoring is much broader because we’re getting so much more data from outside the organization from the Internet of Things, from social media, and from all types of environments. Monitoring becomes a much bigger issue when we talk about this modern environment.
Evaluate is about weighing the implications and the consequences, as well as what you think you’re going to do based on what is possible. That part of the cycle is typically supported by BI tools. Increasingly it will be supported by the data lake as we move forward, but essentially this is where we, as BI providers, have focused our attention.
Decide is deciding among possible and recommended courses of action. This is where we actually come to some conclusions, and often it is a process that involves collaboration, meetings and so on. Again, as we look at data coming into the business from the outside world, I think we’re going to see a lot more emphasis on automating some of those decision-making processes.
Finally, act is changing behaviors, changing the processes, doing whatever needs to be done to make the cycle work, and then linking it back to monitoring. And this is the closed-loop piece, which is really important. And, in fact, it is something we generally don’t do very well today.
Those are the four steps in the decision cycle, and it is very important that we look at them as a closed-loop sense-and-respond type of environment.
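To make the four steps concrete, here is a minimal sketch of one pass through a MEDA loop. Every name, threshold and value below is invented for illustration; the interview describes the cycle conceptually and prescribes no API.

```python
# Hedged sketch of the MEDA closed loop: monitor, evaluate, decide, act.

def monitor(sensors):
    """Collect the latest reading from each source, inside or outside the enterprise."""
    return {name: read() for name, read in sensors.items()}

def evaluate(readings, thresholds):
    """Flag any reading that breaches its threshold."""
    return [name for name, value in readings.items() if value > thresholds[name]]

def decide(flagged):
    """Choose a course of action for each flagged reading."""
    return {name: "inspect" for name in flagged}

def act(decisions, log):
    """Carry out the decisions and record them so the next monitor pass sees the effect."""
    log.extend(decisions.items())
    return log

def meda_cycle(sensors, thresholds, log):
    """One pass of the closed loop; in production this would run continuously."""
    readings = monitor(sensors)
    flagged = evaluate(readings, thresholds)
    decisions = decide(flagged)
    return act(decisions, log)

# One hot axle reading, one normal vibration reading:
sensors = {"axle_temp_c": lambda: 92.0, "vibration_g": lambda: 0.3}
thresholds = {"axle_temp_c": 80.0, "vibration_g": 0.5}
print(meda_cycle(sensors, thresholds, log=[]))  # [('axle_temp_c', 'inspect')]
```

The important design point is the last step: act feeds its results back into what monitor observes next, which is the closed-loop, sense-and-respond character the interview emphasizes.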
With MEDA, are we looking at this as a place to start? Should we be starting with the decision cycle?
Barry Devlin: The decision cycle is a wonderful place to start thinking about how we want to support decision making within the organization and help people make decisions in a structured way. As I said earlier, it keeps the safety rails in place. MEDA is where I tend to start thinking about decision making and about how it fits into the modern environment.
How does this fit with the production analytic platform we’ve talked about in previous podcasts?
Barry Devlin: The short answer is that the production analytic platform gives us a way to manage the complexity of this environment. As I’ve said in previous podcasts, if you look at the modern world of analytics, BI, operational systems and so on, it’s a very complex set of data flows. The production analytic platform simplifies those flows by bringing them together in a single, well-managed environment.
Maybe the easiest way to talk about this is to go through an imaginary example of a high-speed train service that runs across Europe, departing from sunny Madrid with Copenhagen as its final destination. It stops en route in Paris and Frankfurt, reaches top speeds of 120 miles per hour, and as it heads across the Spanish plains it spews not smoke, but data. That data describes every aspect of the train’s performance: multiple streams of events and measures, sent to a data processing center in Berlin where the progress and performance of all of this company’s trains are monitored in real time.
This time-series data details the rotation speed of the axles, the temperature, the vibration level, the lube pressure and so on for every wheel on the train. It comes from different systems and sensors from different manufacturers, in different formats, so we have to bring it all together. Effectively monitoring this performance is where we start with the production analytic platform: monitoring the world as it evolves. Using SQL and other tools, business users and applications can monitor all of this alongside the traditional operational data that lives in the same environment. This is monitoring, and it happens within the production analytic platform.
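The interview says business users monitor the incoming time-series with SQL. The sketch below illustrates what such a query might look like, using an in-memory SQLite database; the table name, columns and readings are all invented for the example, and a production analytic platform would of course hold far richer data.

```python
import sqlite3

# Hypothetical wheel-telemetry table for the imaginary Madrid-Copenhagen train.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE wheel_telemetry (
        train_id TEXT, carriage INTEGER, wheel INTEGER, ts TEXT,
        axle_rpm REAL, temp_c REAL, vibration_g REAL, lube_bar REAL
    )
""")
conn.executemany(
    "INSERT INTO wheel_telemetry VALUES (?,?,?,?,?,?,?,?)",
    [
        ("HS-101", 12, 3, "2024-06-01T00:05:00", 1450, 68.0, 0.2, 5.1),
        ("HS-101", 12, 3, "2024-06-01T00:10:00", 1455, 74.5, 0.4, 5.6),
        ("HS-101",  7, 1, "2024-06-01T00:10:00", 1452, 55.0, 0.1, 5.0),
    ],
)

# Latest reading per wheel, keeping only wheels that are running hot.
hot_wheels = conn.execute("""
    SELECT carriage, wheel, temp_c
    FROM wheel_telemetry AS t
    WHERE ts = (SELECT MAX(ts) FROM wheel_telemetry
                WHERE carriage = t.carriage AND wheel = t.wheel)
      AND temp_c > 70
""").fetchall()
print(hot_wheels)  # [(12, 3, 74.5)]
```

A dashboard or application polling a query like this is the "monitor" step of the MEDA loop running inside the platform.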
As our train is heading across Europe, eventually something is going to go wrong, as I’m sure you can imagine, so we need to be able to look at how that might happen. The starting point is always developing models that correlate abnormal measures, rising temperatures, vibrations and unexpected situations, and that is usually done in high-performance analytic data lakes, combining data from multiple places. In this particular example, of course, we’re going to have a problem – otherwise there wouldn’t be a story. As the train is crossing into France just after midnight, hydraulic pressure and temperature are rising on one of the wheels of carriage 12. This is a situation we need to deal with, and we need to understand its likely implications. There is a 52 percent probability of failure by the time the train reaches Paris, rising to 78 percent by the time it arrives in Frankfurt. Something has to be done – the wheel has to be fixed or the carriage pulled from service before the train reaches its destination.
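The two quoted probabilities, 52 percent by Paris and 78 percent by Frankfurt, imply a risk curve over the remaining route. A real model would be fitted in the data lake; as a toy illustration only, the sketch below linearly interpolates risk over distance travelled, with the route distances invented for the example.

```python
# Hedged sketch: interpolate failure risk between the two quoted waypoints.
# The kilometre figures are made up; only the 0.52 and 0.78 come from the text.

def failure_probability(km_travelled, p_paris=0.52, p_frankfurt=0.78,
                        km_paris=1050, km_frankfurt=1620):
    """Piecewise-linear failure risk as the train progresses along the route."""
    if km_travelled <= km_paris:
        return p_paris * km_travelled / km_paris
    slope = (p_frankfurt - p_paris) / (km_frankfurt - km_paris)
    return p_paris + slope * (km_travelled - km_paris)

print(round(failure_probability(1050), 2))  # 0.52 at Paris
print(round(failure_probability(1620), 2))  # 0.78 at Frankfurt
```

This is the "evaluate" step: turning raw monitoring signals into a quantified consequence the business can act on.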
This is our evaluation and decision-making process: using models and analysis to find possible answers with all types of tools, such as simple BI, SAS or Jupyter scripts and so on, that we see, for example, on the recently introduced Teradata Analytics Platform.
So we’ve come to the situation where we have some possible decisions, and now it’s time to take an action. Here is where it all comes together, because we’re looking at this data coming in in real time, but we really want to make decisions about how it will impact the passengers on the train and the service around the train. For example, if we have space on the train, we might be able to move people around in the carriages, which would allow us to simply pull the faulty carriage from the train. But we might not have space on the train.
This is where we see the value of blending traditional operational business data with the data coming in from the train that indicates the problem. We get to make decisions based on whether a spare carriage is in the right place if we have to pull one out and push one in, whether we have engineering staff with the right skills if we have to make a repair, and so on. All of those considerations are the MEDA loop in action, all happening within the production analytic platform. That is an example of how it might all work.
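The options described above, reseat passengers, swap in a spare carriage, repair at the next stop, or withdraw the train, can be sketched as a simple decision rule that blends the live failure risk with operational data. Every threshold, field name and action label below is invented for illustration; a real "decide" step would draw on far more context and often involve people as well as rules.

```python
# Hedged sketch of the "decide" step: live sensor risk plus operational data.

def choose_action(p_failure, spare_seats, carriage_passengers,
                  spare_carriage_at_next_stop, engineers_at_next_stop):
    """Pick a course of action for a carriage with a failing wheel."""
    if p_failure < 0.5:
        return "continue and keep monitoring"
    if spare_seats >= carriage_passengers:
        return "reseat passengers and pull the faulty carriage"
    if spare_carriage_at_next_stop:
        return "swap in the spare carriage at the next stop"
    if engineers_at_next_stop:
        return "repair the wheel at the next stop"
    return "withdraw the train from service"

# 52% risk approaching Paris, with enough spare seats to empty carriage 12:
print(choose_action(0.52, spare_seats=40, carriage_passengers=36,
                    spare_carriage_at_next_stop=False,
                    engineers_at_next_stop=True))
# reseat passengers and pull the faulty carriage
```

The point of the sketch is the blend: `p_failure` comes from the streaming sensor analytics, while seats, spare carriages and staff availability come from traditional operational systems, all joined inside the production analytic platform.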
Great example that encompasses IoT, the Internet of Things – bringing all that data together in real time.
Barry Devlin: It is classic IoT and also a bit futuristic because, as you may or may not know, it is actually impossible to get a single train all the way across Europe. The countries don’t talk to one another that well. They also need a MEDA loop for better decision making!
Sure. In summary, why is this so important?
Barry Devlin: The real importance here is that we’re looking at an environment, and the example I’ve just provided is one of many, where timeliness, consistency and cross-organization integration all need to be handled in real time. Data needs to be brought together. Decisions need to be made. There needs to be tight integration between the different sources and targets, as well as between the different people involved. It gives us the possibility of making sure that our organizations and our people don’t drop the ball. In the example I’ve just given, if we don’t have all of the data from different sources coming together at the right time in the right place, we may end up pulling the train out of service when we didn’t need to. Then we have a whole lot of unhappy customers stranded in France in the middle of the night when they were hoping to get to Copenhagen. That’s not a good situation to be in.
Not at all! Barry, I want to thank you for discussing the importance of a process for decision making, which I feel is essential for analytics, and the production analytic platform.
SOURCE: An Integrated, Closed-Loop Decision Cycle