Applying Analytics to Sensor Data

Originally published February 14, 2017

Ron Powell, independent analyst and expert with the BeyeNETWORK and the Business Analytics Collaborative, interviews Chad Meley, vice president of product and solutions marketing for Teradata. They discuss how IoT accelerators help companies be more effective in their sensor-data analytics.

Chad, you’ve had some recent announcements regarding the Internet of Things (IoT) or IoT accelerators. Can you expand on those?

Chad Meley: Let’s start with the term accelerators. What we mean by an accelerator is a combination of technology-agnostic intellectual property and professional services harvested from proven field engagements with some of our larger customers. What this allows an organization to do is benefit from proven techniques – whether it’s how to model the data, the right KPIs, or the types of analytics to apply against a specific business problem. That helps them reduce the time to value, save costs and take risk out of the project. That’s an accelerator in its broadest sense. Specifically, what we just announced are four accelerators for what Teradata calls the Analytics of Things – taking the Internet of Things and applying analytics against the sensor data. What we have found is that a number of our clients and prospects are struggling in a couple of areas as it relates to the Analytics of Things.

One area is that they’re realizing that analysis against sensor data is a whole different animal. It’s happening at a different scale. The sensors are creating time-series data, which can be new and foreign to a lot of organizations. Also, these analytics are increasingly becoming location-aware, bringing in the added dimension of geography. It’s not good enough for a company like Caterpillar to know the oil pressure on a 400-ton truck. They also need to know whether that truck is in a jungle in South America or at altitude in the Rocky Mountains, because location plays a heavy role in correlating those readings.
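To make the location dimension concrete, here is a minimal sketch of an altitude-aware check – the truck IDs, coordinates and pressure thresholds below are invented for illustration, not drawn from Caterpillar:

```python
# Hypothetical illustration: the same oil-pressure reading can mean different
# things at different altitudes, so each reading carries time AND location.
readings = [
    {"ts": "2017-02-14T08:00", "truck": "T-17", "oil_psi": 52.0,
     "lat": -3.47, "lon": -62.21, "alt_m": 90},     # jungle, near sea level
    {"ts": "2017-02-14T08:00", "truck": "T-09", "oil_psi": 52.0,
     "lat": 39.74, "lon": -105.99, "alt_m": 3200},  # high in the mountains
]

for r in readings:
    # Invented rule: operation at altitude warrants a higher alert threshold.
    threshold_psi = 55.0 if r["alt_m"] > 2500 else 48.0
    status = "ALERT: low oil pressure" if r["oil_psi"] < threshold_psi else "ok"
    print(r["truck"], f'{r["alt_m"]} m ->', status)
```

The same 52 psi reading is fine near sea level and an alert at altitude – which is exactly why geography has to travel with the sensor data.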

The other challenge a lot of our clients face with sensor data is understanding which sensor data to trust and keep. If you think about it, a lot of sensor data can be erroneous. We’re talking about sensors that cost pennies now, and in reality a lot of them can become corrupt and start sending false signals. In fact, yesterday I saw a demo of a piece of streaming technology. Sensors were set up around the room taking temperature and noise readings, and you could watch the readings go up and down. One of the sensors just started saying it was below freezing in the room. Everyone knew it was wrong. What this illustrates – and it’s a small example of what we’re bringing to market around sensor data qualification – is being able to correlate that reading with the other readings in the room to determine that it’s noise and that no action should be taken.
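A minimal sketch of that kind of cross-sensor check, assuming Celsius readings and a tolerance value that are purely illustrative:

```python
# Flag a reading as noise when it disagrees with the consensus of its peers.
# The 5-degree tolerance is an invented figure, not part of any accelerator.
from statistics import median

def is_noise(reading: float, peer_readings: list[float],
             tolerance: float = 5.0) -> bool:
    """True if the reading deviates from the peer median by more than tolerance."""
    if not peer_readings:
        return False  # nothing to correlate against
    return abs(reading - median(peer_readings)) > tolerance

# The below-freezing sensor from the demo: its peers say the room is ~22 C.
print(is_noise(-2.0, [21.8, 22.1, 22.4, 21.9]))  # True -> treat as noise
```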

Another concern companies are struggling with is determining the frequency at which to collect the sensor data. Should a temperature reading be taken five times a second, once a second, once every 10 seconds, or once a minute? These are important questions because if you collect too much data, you’re going to kill yourself on cost – storage, processing and network costs. And if you collect too little, you might miss the key signals that could really lead to an insight.
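The cost side of that trade-off is easy to put rough numbers on. A back-of-the-envelope sketch, assuming an invented 16 bytes per stored reading:

```python
# Storage cost per sensor per year at the sampling frequencies mentioned above.
BYTES_PER_READING = 16                 # assumed: timestamp + sensor id + value
SECONDS_PER_YEAR = 365 * 24 * 3600

for label, hz in [("5x per second", 5.0), ("once a second", 1.0),
                  ("every 10 seconds", 0.1), ("once a minute", 1 / 60)]:
    gb_per_year = hz * SECONDS_PER_YEAR * BYTES_PER_READING / 1e9
    print(f"{label:>16}: {gb_per_year:6.2f} GB per sensor per year")
```

At five readings a second that is roughly 2.5 GB per sensor per year – multiply by tens of thousands of sensors and the storage, processing and network bill is real.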

So one of our Analytics of Things accelerators is for sensor data qualification. It helps organizations figure out the right analytics and rules to apply to the sensor data itself: what’s noise, what’s valuable, and at what frequency the data should be collected to map to their analytics agenda.

Once the sensor foundation is in place, it is necessary to determine the right combinations of analytics and transformations to address specific business problems. We just rolled out two accelerators relative to that. One is around condition-based maintenance. Condition-based maintenance relates to expensive, complex assets – maybe it’s a jet engine, a locomotive, or an MRI machine. These are heavy pieces of equipment that, for safety or monetary reasons, need to be kept running. Unplanned downtime is either a safety hazard, as with a jet engine, or carries financial implications, as with an MRI machine. Predictive and condition-based maintenance is all about predicting the likelihood of an unplanned failure. Sensor data – temperature readings, vibration readings – can provide insight into a pending failure. With that knowledge, a company can take proactive maintenance action, maybe swapping out a particular part to lower the likelihood of failure. Or they might decide not to take the machine down, but knowing that there is an increased likelihood of failure, they can make sure they have inventory in that area. Then, if it does go down, it will go down for three hours, not three days. The other thing companies can do, after looking at failures across many different types of machines, is determine the right maintenance protocol to lower the likelihood of failure across all of the machines.
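As an illustration of the idea – a toy rule, not the accelerator’s actual model – here is a failure-risk sketch built from temperature and vibration readings; every threshold and weight is an assumption:

```python
# Toy condition-based maintenance rule: sustained heat plus rising vibration
# raises the estimated likelihood of an unplanned failure.
def maintenance_action(temps: list[float], vibs: list[float]) -> str:
    hot_fraction = sum(t > 95.0 for t in temps) / len(temps)  # invented threshold
    vib_rise = max(vibs[-1] - vibs[0], 0.0)                   # rising vibration
    risk = 0.6 * hot_fraction + 0.4 * min(vib_rise / 2.0, 1.0)
    if risk > 0.7:
        return "swap the part proactively"
    if risk > 0.4:
        return "pre-position spare inventory nearby"
    return "no action"

print(maintenance_action(temps=[96, 97, 99, 101], vibs=[1.2, 1.9, 2.6, 3.3]))
```

The two non-trivial outcomes mirror the interview: act proactively, or accept the risk but stage inventory so a failure costs three hours instead of three days.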

Another accelerator has to do with manufacturing performance optimization. In fact, I was just with a company called Valmet. Valmet is a Finnish company, and they make machines that make paper. That doesn’t sound that interesting until you see what one of these machines looks like. Each machine is the size of a football field. Logs are fed into one end, and out the other end come 60-ton rolls of paper. All of that transformation – grinding the wood down to pulp and turning out the finished high-quality product, in the right color and the right width – happens inside that one machine. They are looking at how they can make the equipment as efficient as possible.

One example Valmet provided is making the machine consume as little energy as possible. By collecting data right off this large piece of machinery, they are able to gain insights and then make recommendations to the owners of the equipment. Another example involves the consumables that go into making paper – belts, hoses, the liquids that go in. The overall worldwide market for paper mill consumables, according to them, is 2.5 billion euros, and through analytics they think they can reduce that by 20 percent, which is absolutely huge. In fact, that’s now turning into a sizable part of their business. Right now their business is selling a really big capital-expenditure piece of equipment for making paper. But now they can ask the purchaser what they’re spending on consumables and offer to advise them on the analytics to bring that spend down. In fact, what they’re pitching to their clients is a shared-risk model: they think they can save 30 million euros a year in consumables reduction, so they offer to do the analytics for free and split the savings. It’s a very interesting new business model where they’re taking the data and turning it into the service.

The point of that example is that through working with companies like Valmet, we’re able to harvest that IP. We can take the kinds of analytics and transformations required to get the insight, package them up for other industrial companies, and accelerate their journey to getting similar value from their analytics.

So are you marketing an analytics application accelerator? What is the customer buying?

Chad Meley: Excellent question. The way that we’re going to market with these is as a service. But rather than the service being six months long, we shave a lot of time off of that because we’re coming in with pre-built accelerators, and that’s explained to the client. Another part of our overall strategy that we recently announced is a complete set of downstream packaged solutions, where you might be buying software with a maintenance agreement and a support contract. But what I’m describing as an accelerator includes more than just services. It includes templates for the data model, for the transformations, and for the resulting visualizations – all of which accelerate the traditional professional services engagement.

Wow – that’s quite a powerful offering. And you said there are four accelerators?

Chad Meley: Yes, there is one other, called the Visual Prospect Accelerator. This one is similar to the sensor data qualification accelerator I referred to earlier, in that the ultimate objective is to help an organization figure out which sensor data to focus on. I know we’re on a podcast and this has “visual” in its name, so the way to really experience it is to see it, but I’ll do my best. I’ll use Caterpillar as an example. They have hundreds, if not thousands, of vehicles. These vehicles have transmissions, and these transmissions fail on occasion. Part of a transmission is something called a skid plate. So if you had sensors on every skid plate, as well as on other components of the transmission, you could start looking at failed transmissions and first determine whether a skid plate sensor reading is predictive of a transmission failure. And it doesn’t end there. You’re not just looking at raw data. There are 20 different transformations that could be done on a skid plate reading for a particular transmission – a Fourier transformation, say, or a moving average. What this visualization accelerator allows you to do is show all the different sensor readings – in this case, the skid plate readings on a transmission – and toggle back and forth between the types of transformations leading up to a failure. You can see which ones started going crazy and looked interesting preceding that failure, so you know what to focus on and what not to focus on. For certain kinds of transmissions, you might want to keep that data for a while and apply that transformation rule. For other types of transmissions, or moving averages on different sensor data that aren’t interesting, why bother collecting it at the frequency, or storing it for the duration, that you would for interesting data?
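To illustrate just one of those transformations – and only as a sketch, with invented readings and window size – here is a moving average over a hypothetical skid-plate series:

```python
# A moving average is one of the many transformations that can be applied to a
# raw sensor series before asking whether it "goes crazy" ahead of a failure.
def moving_average(readings: list[float], window: int = 3) -> list[float]:
    return [sum(readings[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(readings))]

skid_plate = [0.9, 1.0, 1.1, 1.0, 1.4, 2.1, 3.5]  # last readings precede a failure
print(moving_average(skid_plate))  # smoothed series climbs sharply before failure
```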

Are these accelerators predictive or prescriptive? How would you quantify them from an analytics perspective?

Chad Meley: For condition-based maintenance, they are absolutely predictive. That’s the whole point – what is the likelihood of this particular asset having a failure in the future. For manufacturing performance optimization (MPO), it actually runs the gamut. So it starts with a very descriptive set of metrics around something we call an OEE score – an overall equipment effectiveness score. It is expressed as a percent. So if a factory is running at 100 percent, it means that all of the manufacturing equipment has 100 percent uptime. It’s running at 100 percent peak performance with zero defects associated with it. That’s unachievable for the most part. The worldwide average for OEE scores is 60 percent. Even best-in-class manufacturers are at about 85 percent OEE.
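The OEE arithmetic itself is standard – availability times performance times quality, expressed as a percent. A minimal sketch with invented sample inputs:

```python
# Overall equipment effectiveness: 100% only with full uptime, peak rate,
# and zero defects, matching the description above.
def oee(availability: float, performance: float, quality: float) -> float:
    """Each input is a fraction in [0, 1]; the result is a percentage."""
    return availability * performance * quality * 100.0

print(oee(1.0, 1.0, 1.0))     # 100.0 -- the mostly unachievable ideal
print(oee(0.85, 0.82, 0.86))  # ~59.9 -- near the worldwide average of 60%
```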

So the MPO starts off with these descriptive measures around an OEE score – what happened in the past, and why is the OEE score at, for example, 68 percent? Is it quality or performance related? Again, very descriptive. But then it also does some root-cause analysis, which gets into the diagnostic layer and indicates which levers to pull if you want to improve quality or performance. Then it even does some predictive analysis: if you were to make these changes, this is what your OEE score could be in a week or a month.

Excellent. Chad, thank you so much for sharing with us the IoT accelerators and the value that they are bringing to your customers. 

  • Ron Powell
    Ron is an independent analyst, consultant and editorial expert with extensive knowledge and experience in business intelligence, big data, analytics and data warehousing. Currently president of Powell Interactive Media, which specializes in consulting and podcast services, he is also Executive Producer of The World Transformed Fast Forward Show. In 2004, Ron founded the BeyeNETWORK, which was acquired by TechTarget in 2010. Prior to founding the BeyeNETWORK, Ron was cofounder, publisher and editorial director of DM Review (now Information Management). He maintains an expert channel and blog on the BeyeNETWORK and may be contacted by email at rpowell@powellinteractivemedia.com.

