
With Data, Bigger Is Not Always Better

March 13, 2015
VDC Research analyst offers help for mopping up the data deluge

You know the complaints. Now that we have all this data from all our sensors, what are we going to do with it? It’s expensive to store, and we don’t have time to analyze it all. All we have are piles of raw data. Is this IoT a thing or a joke?

In his report, “Retro-Analytics for IoT: When Bigger Data Is Not Better,” Steve Hoffenberg, director of IoT & Embedded Technology at VDC Research, describes the problem this way: “Newcomers to the IoT [Internet of Things] are often tempted to think that all data collected should be sent to the cloud to maximize the potential for big data analytics. . . In theory they are correct. In practice, however, it is impractical if not financially infeasible to operate this way.”

He adds that just because an individual can buy a terabyte of data storage for less than $100, that doesn’t mean storing data from hundreds or even thousands of sensors sampled at, say, millisecond intervals will come cheap. A large factory could generate a terabyte of data per hour. And all that gathering may accomplish is adding to the flood of useless data.

“The vast majority of raw IoT sensor data is not being sent to the cloud, nor should it be. The majority of it never needs to reside anywhere other than momentarily in RAM on-board the device or a gateway,” he says.

That raises the question of exactly what data should be saved, and how we know we’re collecting the right information.

Hoffenberg’s solution is what he calls “retro-analytics,” a multi-layered process of cutting through the data noise. It is a combination of analysis at the local and the cloud level.

“One method to deal with [the] problem is to install gateways to do preliminary analysis at the local level,” says Hoffenberg. Sensor data is fed into what he calls intelligent gateways or local servers. There it is “filtered so that only noteworthy sensor data—data that may be acting abnormally—is sent to the cloud. It is also sent at a local level to the operator as well,” he says, which addresses the time-lag problem. If you set up a system to feed all data to the cloud for analysis, you lose time you can’t afford to lose in time- and safety-critical situations. If a process is out of spec, the operator needs to know now, not when the cloud system finishes the analysis and sends it back to you.
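The gateway-side filtering described above can be sketched roughly as follows. This is an illustrative example, not code from the VDC report; the sensor name, thresholds, and callback names are assumptions.

```python
# Hypothetical sketch of gateway-side filtering: in-spec readings stay
# in local RAM and are discarded, while out-of-band readings trigger an
# immediate local operator alert (no cloud round-trip delay) and are
# forwarded to the cloud as "noteworthy" data.

LOW, HIGH = 20.0, 80.0  # assumed normal operating band for one sensor

def filter_reading(sensor_id, value, send_to_cloud, alert_operator):
    """Forward only abnormal readings; normal ones are dropped locally."""
    if value < LOW or value > HIGH:
        alert_operator(sensor_id, value)  # operator learns now, not later
        send_to_cloud(sensor_id, value)   # only noteworthy data leaves site
        return True
    return False

# usage: an out-of-spec reading is forwarded and alerted; an in-spec one is not
forwarded, alerts = [], []
filter_reading("temp-01", 95.2, lambda s, v: forwarded.append((s, v)),
               lambda s, v: alerts.append((s, v)))
filter_reading("temp-01", 50.0, lambda s, v: forwarded.append((s, v)),
               lambda s, v: alerts.append((s, v)))
```

Note that the local alert fires before any cloud transmission, which is the point of doing the preliminary analysis at the gateway.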

IoT Analytics Types

Courtesy of the VDC Research report

“It’s a kind of feedback loop,” Hoffenberg explains further. “We’ll stream all of it to a central data system and sort out what the key factors are we need to be looking out for. Once those are understood, you can push those algorithms and other factors back to local level. It’s an on-going process, a way to continually refine the analytics going on at the local level. The advantage of doing it this way is that it’s a much smaller volume of data that needs to be streamed to cloud. Big data is continually refining the data, defining and refining the analysis for the local level.”
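The feedback loop Hoffenberg describes can be sketched in miniature. Everything here is an assumption for illustration (the class and function names, and the mean-plus-three-standard-deviations rule as the “algorithm” the cloud derives): the cloud analyzes an initial stream, computes a tighter normal band, and pushes it back so the gateway can filter more data locally.

```python
# Illustrative sketch of the cloud-to-local feedback loop. The cloud
# side derives refined thresholds from streamed samples; the gateway
# starts by streaming everything and narrows its filter once the cloud
# pushes refined limits back down.
import statistics

def refine_thresholds(samples, k=3.0):
    """Cloud side: derive a normal band as mean +/- k standard deviations."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return (mu - k * sigma, mu + k * sigma)

class Gateway:
    def __init__(self, low=float("-inf"), high=float("inf")):
        # Initially everything is "noteworthy" is False: stream-all phase
        # would forward raw data; here we model only the filter state.
        self.low, self.high = low, high

    def push_thresholds(self, low, high):
        """Cloud pushes refined limits back to the local level."""
        self.low, self.high = low, high

    def is_noteworthy(self, value):
        return not (self.low <= value <= self.high)

gw = Gateway()
low, high = refine_thresholds([50, 51, 49, 50, 52, 48])
gw.push_thresholds(low, high)
```

Re-running `refine_thresholds` on later data and pushing the result again is what makes the loop "on-going": the cloud keeps refining the analysis the local level applies.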

Hoffenberg points out that retro-analytics (or effectively using big data for anything) is not a one-size-fits-all proposition. Every situation is a little different. Industry regulatory requirements vary; practices vary; the business needs of each operation vary; the scale of every operation is different. A Ford or Boeing needs to gather and retain a much different set and quantity of data than a Merck or a General Mills.

“A fairly simple system with a small number of machines probably does not require retro-analytics, because the relationships between sensor data and system needs are often obvious. However, in a facility with an extensive product line involving dozens or hundreds of machines, the interrelationships between the sensors may not be self-evident to either the system’s designers or its operators,” says Hoffenberg. “Even in systems where you want to send data for a certain time and then continually [purge] it, in very few cases is it necessary to save all the data.”

According to Hoffenberg, not every company needs big data, but exploring the possibilities it offers can reap rewards, even for the little guy. “A lot of small facilities aren’t going to go to the effort to set up an IoT-based system,” he says. “But a lot of relatively small companies find that when they do implement these systems, they discover ways to optimize their production.”

And companies don’t have to go it alone to set up such a system. Many vendors (exida and Eurotech are just two) offer prepackaged data analytics and services.

“In a factory automation scenario, the equipment vendor can offer IoT or analytic services. More and more equipment vendors are offering this service. From their point of view, it helps everybody. Instead of selling equipment outright, they are installing it on a service basis and leasing it. From [the] factory owner’s view, it can be more of an on-going operating expense than a capital expense,” says Hoffenberg.

And it’s not just the major players who can come to the party. “In some cases a small company could be in [a] better position to push that business model, especially if the CEO is behind the idea. Sometimes they are more effective at making it happen,” he adds. “A lot of equipment makers are using some of these services as a back-end. They can preconfigure their systems [to use some of these analytics]. The system is already set up with certain common measurements. Where it gets more complicated is in a large facility with a lot of pieces of equipment.”

