“Edge/cloud strategies are a very important factor to design and account for on the front end of a project,” explains Sam Hoff, chief executive officer, Patti Engineering, a Control System Integrators Association (CSIA)-certified system integrator in Auburn Hills, Michigan. “It is important to decide what information you want to process at the edge and what information you want to analyze in the cloud (Figure 1).”
Data is king, and quick access for real-time decision-making is crucial, notes Jeff Sanders, B.S.M.E., system integration manager at George T. Hall, also a CSIA-certified member, headquartered in Anaheim, California. "We have recently seen customers sparing no expense in order to access all the data possible from the process; we’ve seen a push to more edge computing for real-time decision-making and to reduce decision-making in the cloud," he adds.
"I have seen a much larger push to collect the smaller bits of data, meaning more data from small skid systems or vendor-supplied equipment," explains Heath Stephens, PE, digitalization leader at Hargrove Controls & Automation, a CSIA-certified member based in Mobile, Alabama. "Whereas previously these systems may have interfaced with a larger control system only via a couple of hardwired start/stop and alarm signals, now Ethernet communication is much more the norm. There is also an increased use of industrial wireless for monitoring and control. Clients are increasingly comfortable with remote vendor access to systems through cellular gateway devices."
How much is too much?
DigiKey is seeing a massive shift in data collection and transmission to secure edge and cloud computing devices, says Eric Halvorson, partnership marketing manager—strategic programs. “Intelligent machinery is capable of collecting and analyzing enormous numbers of data points while maintaining high levels of cybersecurity,” he explains. Growth in new technologies such as single-pair Ethernet (SPE) allows devices to communicate quickly across short distances in situations where noise is a factor, all while reducing the amount of copper required by 50%, notes Halvorson.
“Data collection and transmission seems to be something everyone wants, but few know how to use well,” says Larry Stepniak, electrical engineer at Flint Group. “Process variables can be collected and logged quite easily from newer machines. Older machines with less modern types of communication are more difficult. With multiple machines and multiple variables, this can quickly lead to data storage and/or network bandwidth issues. Care has to be taken to avoid collecting useless data points or having sampling frequencies that are too high.”
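The storage and bandwidth concern Stepniak raises is commonly addressed with deadband (report-by-exception) filtering, where a sample is logged only when it moves meaningfully from the last logged value. Below is a minimal sketch of that idea; the function name, the sample data and the deadband value are illustrative assumptions, not from any specific product.

```python
def deadband_filter(samples, deadband):
    """Keep only samples that differ from the last logged value
    by more than the deadband, discarding redundant points."""
    logged = []
    last = None
    for timestamp, value in samples:
        if last is None or abs(value - last) > deadband:
            logged.append((timestamp, value))
            last = value
    return logged

# Hypothetical process-variable samples: (seconds, temperature)
raw = [(0, 50.0), (1, 50.1), (2, 50.2), (3, 52.5), (4, 52.6), (5, 48.0)]
print(deadband_filter(raw, deadband=1.0))  # → [(0, 50.0), (3, 52.5), (5, 48.0)]
```

With a 1.0-unit deadband, six raw samples collapse to three logged points, illustrating how a sensible deadband trades resolution for storage and network load.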
These various logs are often stored in equally various formats, such as comma-separated values (CSV), database file (.dbf) or text file (.txt) and are not always readable by a human in their native forms, notes Stepniak. “This leads to data manipulation and report generation that can hopefully be automated but still requires someone to perform or monitor these tasks,” he says.
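The automated manipulation Stepniak describes usually starts by normalizing each log format into a common record shape. The sketch below parses a CSV log and a whitespace-delimited text log into the same list-of-dicts form; the tag names, column layout and sample contents are assumptions for illustration.

```python
import csv
import io

def parse_csv_log(text):
    """Parse a CSV process log (header row assumed) into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def parse_txt_log(text):
    """Parse a whitespace-delimited text log laid out as: timestamp tag value."""
    rows = []
    for line in text.strip().splitlines():
        timestamp, tag, value = line.split()
        rows.append({"timestamp": timestamp, "tag": tag, "value": value})
    return rows

# Hypothetical machine logs in two different native formats
csv_log = "timestamp,tag,value\n08:00,TT-101,72.4\n"
txt_log = "08:00 PT-204 14.7"

records = parse_csv_log(csv_log) + parse_txt_log(txt_log)
print(records)
```

Once both formats share one record shape, report generation can run over a single stream instead of per-format scripts.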
All of these points may sound gloomy, but the future definitely looks brighter, Stepniak explains. “Edge computing is taking some of the burden off of the machine devices,” he says. “Vendors have begun to address methods to make this data more usable. Systems that allow the user to specify which data points they require and allow them to build custom dashboards for viewing are becoming more available and easier to use (Figure 2). Cloud devices are alleviating storage issues. For older equipment there are several companies making interface devices to speak to modern collection systems.” All of these developments seem to be trending in the right direction for good usable data, notes Stepniak.
Changing data landscape
Data collection and transmission are still very top-down, however, notes Jason Andersen, vice president, strategy and business line management, at Stratus. "At the application level there is a lot of data science occurring, so teams need to understand supply chains or optimize product flows," he explains. "But the machine-level factor is still being deployed very tactically. You hear all these great stories of savings, which are true, but those are almost always a result of something being broken or that cannot be easily diagnosed (Figure 4)." As a result, Andersen says, edge-based analytics aren't being used strategically yet.
Due to this trend, an Internet-of-Things (IoT) gateway product with edge artificial intelligence (AI) for video data could be necessary, says Yang.
"There has been a lot of movement toward data collection and transmission," says Mark Steffens, CEO of Airline Hydraulics. "About two dozen customers are interested in this world." Yet the biggest issue Airline Hydraulics has experienced is integration simplicity and commitment at the customer level to have the dedicated business intelligence (BI) and overall equipment effectiveness (OEE) tools available to their employees, he explains.
In addition, the need for predictive and preventive tools will grow with fewer technicians graduating from school, cautions Steffens. "Sadly, I believe companies in the United States are slow to adopt these technologies because we are so short-term versus long-term real-profit focused," he says.
"We hired three full-stack developers three years ago and added another two in recent years, working to improve our internal processes and provide application programming interfaces (APIs). Airline will continue to invest in this area, and this will become an increasingly recognized resource for our customer base," Steffens promises.
“There is a high demand for data collection and transmission in many industrial sectors,” confirms Cory Engel, application engineer at Automation24, “and this will likely only continue to increase.” Using process data for on-site, enterprise-level system analysis and data trending is already quite common, and maintaining redundant records of key process data is even a regulatory requirement in some industries, he says.
Secure virtual-private-network (VPN) cloud-based systems are increasing in popularity, notes Engel. “The advantage for these systems is the ability to securely remote-in to your factory control system from anywhere in the world with an internet connection,” he explains. “Once connected, you can perform actions ranging anywhere from monitoring your process status to modifying programmable-logic-controller (PLC) code.”
These cloud-based systems can also be set up to provide automatic notifications and alarms to a device such as a cell phone when certain user-defined thresholds are exceeded, adds Engel.
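The user-defined-threshold alarming Engel describes reduces to a simple comparison against per-tag limits. Here is a minimal sketch of that check; the tag names, limit values and message format are hypothetical, and the actual delivery to a phone would go through whatever SMS or push-notification service the cloud platform provides.

```python
def check_thresholds(readings, limits):
    """Return alarm messages for any reading outside its user-defined limits.
    Tags without configured limits are never alarmed."""
    alarms = []
    for tag, value in readings.items():
        low, high = limits.get(tag, (float("-inf"), float("inf")))
        if value < low or value > high:
            alarms.append(f"ALARM: {tag}={value} outside [{low}, {high}]")
    return alarms

# Hypothetical limits and a current snapshot of readings
limits = {"boiler_temp": (60.0, 90.0)}
alarms = check_thresholds({"boiler_temp": 95.2, "line_speed": 120.0}, limits)
print(alarms)  # boiler_temp exceeds its high limit; line_speed has no limits
```

In practice each alarm string would be handed to a notification service rather than printed.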
The year 2022 seemed to be the year of talking about digital transformation, points out Freeman Smith, founder of Nufactur. “Hopefully, 2023 will be the year of acting on it,” he says optimistically. “It seems so far that digital transformation is something most companies recognize is important, but something they are wholly unprepared to carry out (Figure 5). The digital transformation is happening at the edge first, with I/O devices collecting data that can feed back into existing control systems. An increase in sales of sensors and IO-Link masters seems to be the only noticeable change.”
Smith predicts we will know that the digital transformation has truly begun once companies are purchasing message-queuing-telemetry-transport (MQTT) brokers to transmit big data from the factory floor to their cloud-based systems.
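The MQTT pattern Smith predicts boils down to publishing tag values as small messages on hierarchical topics. The sketch below builds a topic string and JSON payload using only the standard library; the topic scheme, site and tag names are assumptions, and an actual client library such as paho-mqtt would handle the connection and publish step.

```python
import json

def build_mqtt_message(site, line, tag, value, ts):
    """Build an MQTT topic and JSON payload for an edge-to-cloud publish.
    Topic levels follow a hypothetical plant/site/line/tag hierarchy."""
    topic = f"plant/{site}/{line}/{tag}"
    payload = json.dumps({"tag": tag, "value": value, "ts": ts})
    return topic, payload

topic, payload = build_mqtt_message("anaheim", "line2", "motor_temp", 71.3, 1700000000)
print(topic)    # → plant/anaheim/line2/motor_temp
print(payload)
# A broker client (e.g. paho-mqtt's client.publish(topic, payload))
# would then transmit this to the cloud-side broker.
```

Subscribers on the cloud side can then use topic wildcards (such as `plant/+/+/motor_temp`) to pull one tag across every line without knowing each machine in advance.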
“Sterling Systems & Controls’ systems have always been based around collection and management of data, so we have not seen a lot of change,” explains Mike Drew, engineering manager at Sterling, which specializes in batch-processing systems. “All end users are increasingly interested in securing their data, so we see constant changes in how they want to handle remote access and security, which makes for challenges in maintaining standards and training our internal staff for providing remote support.”