How have analytics affected diagnostic-tool development?

Sept. 6, 2016
Sensing devices embed diagnostics into the digital payload.

It wasn’t too long ago that plant managers complained of too much data. Collecting it was easy, but parsing it and turning it into something useful was another story entirely. Organizations were drowning in data.

Software companies heard their cries for help and began working on flotation devices in the form of analytics. Now data has become big data, and manufacturers are looking to use it not only to automate ERP data entry, but also to measure productivity and profitability, to virtualize processes and engineering capacity, to monitor equipment health and to gain a strategic advantage over competitors through advanced analytics that turn information into business intelligence.

We asked our panel of nine experts how diagnostic tools have nudged the development of analytic software and vice versa.

Q: How have analytics affected the development of diagnostic tools?

Bob Drexel, ifm efector: Analytics depend on data points coming from devices to be combined with other relational information—for example, sensor value combined with peak/off-peak energy costs—to create a better operational understanding. To accomplish this, sensors will embed diagnostic tools into their digital payloads. Software systems will offload complex algorithms to embedded devices and pull post-processed values. We can expect the typical bell-curve progression for analytic software tools.

They will continue to grow centralized until the complexity becomes unmanageable, and then they will start to decentralize and to offload complexity to the device level. Devices will become more intelligent with more diagnostic tools as centralized systems focus on managing and organizing incoming post-process information.
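Drexel's idea of offloading post-processing to the device can be illustrated with a short sketch. The following hypothetical smart sensor summarizes raw samples on-device and publishes only post-processed values in its digital payload; every class, field and identifier here is illustrative, not any vendor's actual API:

```python
import json
import math

class SmartSensor:
    """Hypothetical smart sensor that post-processes raw samples on-device
    and publishes only summary diagnostics in its digital payload."""

    def __init__(self, sensor_id):
        self.sensor_id = sensor_id
        self.samples = []

    def sample(self, value):
        self.samples.append(value)

    def payload(self):
        # The statistics are computed on the device; the central system
        # pulls only these post-processed values, never the raw samples.
        n = len(self.samples)
        mean = sum(self.samples) / n
        rms = math.sqrt(sum(v * v for v in self.samples) / n)
        body = {
            "id": self.sensor_id,
            "n": n,
            "mean": round(mean, 3),
            "rms": round(rms, 3),
            "peak": max(self.samples),
        }
        self.samples.clear()
        return json.dumps(body)

sensor = SmartSensor("vib-07")
for v in [0.9, 1.1, 1.0, 4.2, 1.0]:
    sensor.sample(v)
print(sensor.payload())
```

The central system receives a few bytes of JSON per reporting interval instead of a raw sample stream, which is the decentralization Drexel describes.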

Bob Drexel is product manager, process sensors, at ifm efector.

Tom Edwards, Opto 22: The growth in analytics and similar diagnostic tools, plus the volume of machine data that supports them, may have raised the bar for the data a machine returns for analysis. Analytic systems must sift through huge amounts of unnecessary data to derive valuable information—thus the growing role of smart devices at, near or perhaps in the machine itself that filter and evaluate machine data, sending only the important information to the analytics system. In the context of the Internet of Things (IoT), this is called edge computing.
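The filtering Edwards describes can be as simple as a deadband applied at the edge. This minimal sketch, with assumed baseline and deadband parameters, forwards only readings that deviate meaningfully from normal and discards the rest:

```python
def edge_filter(readings, baseline, deadband):
    """Minimal edge-computing sketch: forward only readings that deviate
    from the baseline by more than the deadband, discarding the rest."""
    forwarded = []
    for t, value in readings:
        if abs(value - baseline) > deadband:
            # Only "important" data travels upstream to the analytics system.
            forwarded.append((t, value))
    return forwarded

# Five raw samples arrive; only two are worth sending upstream.
stream = [(0, 20.1), (1, 20.0), (2, 27.5), (3, 19.9), (4, 31.2)]
alerts = edge_filter(stream, baseline=20.0, deadband=5.0)
print(alerts)  # → [(2, 27.5), (4, 31.2)]
```

Real edge devices apply far richer logic (FFTs, rate-of-change checks, local models), but the principle is the same: evaluate at the machine, transmit only what matters.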

Tom Edwards is senior applications engineer at Opto 22.

Paula Hollywood, ARC Advisory Group: Until very recently, the common cry from the user community was, “We have plenty of data; we’re drowning in it. What we need is a way to analyze the data that we have.” Enter analytics; it’s the tool for which users have been waiting. By applying advanced analytics to production-related data generated from monitoring and measuring the execution of production processes, manufacturers can now realize continuous process improvements faster and with greater confidence. For leading industrial companies, the use of analytics and big data is becoming one of the primary means to outperform competitors. It comes down to a need to leverage information to make better decisions, adapt quickly and compete effectively.

Rather than rely on historical information to make decisions, the dynamics of the current business environment require manufacturers to anticipate change in order to take appropriate measures in time to make a difference. As new technologies and new ways of doing things surface, more companies feel the need to adopt these solutions, and more employees throughout the enterprise want more and better decision-support tools. Advanced techniques can be applied to big data to discover new facts and relationships that can potentially be turned to business advantage. This is a significant trend, reinforced by the fact that modern predictive analytics tools don’t necessarily require advanced skills; business users can now get the information via advanced analytical tools they can use themselves.

Many of the early applications at the intersection of machine learning and the Industrial Internet of Things (IIoT) will be in predictive or prescriptive maintenance. This class of application promises a tangible, rapid return on investment by helping to eliminate unplanned downtime. Machine learning applications are self-modifying, highly automated and embedded. That is, machine learning algorithms are designed to continuously adapt and improve their performance with minimal human intervention, and they are embedded within a process or workflow so seamlessly that they are invisible to the user or operator. Machine learning algorithms are in their element solving problems that are too difficult or complicated for human programmers to code.

In asset-intensive industries, one of the fastest-growing applications for machine learning is improving maintenance. The traditional preventive maintenance approach assumes that the likelihood of equipment failure increases with usage. Prescriptive maintenance builds on simple condition monitoring to provide advanced notice of a failure so that appropriate maintenance can be scheduled and performed in advance of the failure. The aim is to give a longer-range prediction of failure, with a higher measure of confidence. Ultimately, the goal is to drive toward zero unscheduled downtime.
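The longer-range, higher-confidence prediction Hollywood describes starts with something like trend extrapolation on a condition indicator. This toy sketch, with made-up vibration readings and an assumed failure threshold, fits a least-squares line to the indicator and estimates how long until it crosses the threshold (real prescriptive-maintenance systems use far more sophisticated models):

```python
def remaining_useful_life(times, readings, failure_threshold):
    """Toy prescriptive-maintenance sketch: fit a linear trend (least
    squares) to a condition indicator, such as vibration amplitude, and
    extrapolate when it will cross the failure threshold."""
    n = len(times)
    mean_t = sum(times) / n
    mean_r = sum(readings) / n
    slope = sum((t - mean_t) * (r - mean_r) for t, r in zip(times, readings)) \
        / sum((t - mean_t) ** 2 for t in times)
    if slope <= 0:
        return None  # no degradation trend, so no failure predicted
    intercept = mean_r - slope * mean_t
    t_fail = (failure_threshold - intercept) / slope
    # Time remaining from the most recent measurement.
    return max(0.0, t_fail - times[-1])

# Vibration climbing 0.5 units per day toward a threshold of 10.0.
days_left = remaining_useful_life([0, 1, 2, 3], [2.0, 2.5, 3.0, 3.5], 10.0)
print(days_left)  # → 13.0
```

With an estimate like this in hand, maintenance can be scheduled well before the predicted failure, which is the essence of driving toward zero unscheduled downtime.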

Manufacturers use analytics in a number of applications. They benefit by applying analytic techniques to support continuous-improvement initiatives, plant performance monitoring, decision support, predictive maintenance, KPI generation, process control and quality control. Analytics can also be used to identify and correct production anomalies. The key is finding hidden information that has been overlooked or undervalued and using it to strategic advantage. A modern approach built on data collection and analysis enables manufacturers to develop new techniques that result in greater efficiencies, better yields and increased production flexibility. This can dramatically improve quality and time to market for new products. Together, analytics, business intelligence and big data provide multiple views of information that enable managers, operators and engineers to collaborate, using real-time data and analysis in an information-driven environment.

Information-driven companies are moving to a culture and business model in which decisions are based on analysis of process and business data. Throughout the organization, these companies employ software to collect, contextualize, visualize and analyze data to gain new insights. Armed with those insights, organizations can anticipate changes and drive better business results. Information-driven companies employ advanced analytics throughout their value networks, business processes and decision-making to support corporate initiatives such as energy management and sustainability programs; global growth initiatives; and innovation in products, processes, systems and business models.

Paula Hollywood is senior analyst at ARC Advisory Group.

Brett Burger, National Instruments: Analytics have driven diagnostic tools to become smarter. It is more commonplace now for diagnostic tools to perform analytics on the data and return more useful information, rather than raw sensor data conversion. Tools that are not necessarily smarter may be more open, such that they can tie into cloud/server analytic solutions made by a third party.

Brett Burger is principal marketing engineer at National Instruments.

Joe Van Dyke, Azima DLI: Analytics drive diagnostic tools in a symbiotic manner. Analytic methods and practices determine the optimum data sets to measure, the development and deployment of sensors, and the connectivity of the diagnostic system with other data sources and service specialists. In turn, diagnostic tools that bring the ability to gather new data types, enhance connectivity or enable faster, more effective and new analytical methods drive improvements in analytics.

Joe Van Dyke is vice president of operations at Azima DLI.

Weishung Liu, Fluke Industrial Group: The ubiquity of wireless connectivity and computing power and the lower cost of memory have made it possible for test tools to automatically save data to a smartphone or other connection point and then transmit it to the cloud, where it can be stored and analyzed.

Weishung Liu is product planner at Fluke Industrial Group.

Mitesh Patel, TCS: Advanced diagnostic tools often detect small changes in the physical and chemical characteristics of systems. These changes are often subtle and require complex mathematical calculations to yield any meaningful reading. This has been done for a long time. Such complexity is often embedded in the sensor electronics, with the remaining processing done as post-processing on computer systems.

Higher computing capacity and miniaturized electronics significantly enhance a sensor’s capabilities. This paves the way for predictive diagnostic tools that will deliver tremendous benefits in cost savings, higher efficiency and reduced downtime.

Mitesh Patel heads Internet of Things for the manufacturing industry solution unit at TCS.

Tim Senkbeil, Belden: In order to compete in the world economy, systems must be highly automated and able to function with minimal human input. Diagnostic systems are critical to reducing downtime and allowing systems to run “lights out.” Today’s manufacturing environment has accelerated the development of diagnostic tools, making them a must-have. Diagnostic tools will need to be able to gather and process the vast amounts of data available from smart I/O devices, and even standard sensors, and use advanced analytics to predict component failures and set preventive maintenance schedules. These diagnostic tools will reduce unplanned downtime incidents and allow companies to maximize profits.

Tim Senkbeil is product line manager, Industrial Connectivity Division, at Belden.

Michael Howard, DMDII: Analytics have allowed large amounts of machine data to be used in a variety of diagnostic tools, from machine health monitoring to tool life management, all of which can be used to increase the efficiency of a single machine or an entire factory of machines. Data analysis allows the diagnostic tools to be much more proactive in machine maintenance and operation, rather than reactive to machine or part errors.

Michael Howard is project engineer at Digital Manufacturing and Design Innovation Institute (DMDII) in Chicago, Illinois.

About the Author

Mike Bacidore | Editor in Chief

Mike Bacidore is chief editor of Control Design and has been an integral part of the Endeavor Business Media editorial team since 2007. Previously, he was editorial director at Hughes Communications and a portfolio manager of the human resources and labor law areas at Wolters Kluwer. Bacidore holds a BA from the University of Illinois and an MBA from Lake Forest Graduate School of Management. He is an award-winning columnist, earning multiple regional and national awards from the American Society of Business Publication Editors. He may be reached at [email protected] 