I always enjoy discussing new and future technology with those who create it. So, I recently spoke with a software developer who has been creating HMI/SCADA software for more than 20 years.
“A strong trend to the future for HMI, SCADA and IoT providers is a shift from being a product provider to a services provider,” says Fabio Terezinho, P&L unit leader at InduSoft. “These are services as solutions. They are not engineering services to design the application but to provide software as a service or a subscription-based model. It will no longer be media that is installed and run on premise.”
Artificial intelligence and bots are big areas of research for future HMI solutions. “These can take advantage of existing infrastructure provided by someone else, such as big data,” says Terezinho. “These will be machine learning engines—interfaces that collect the data and provide results and information to the user even if the engineer didn’t predefine those conditions. These are key differentiators we see with HMI software in the future.”
Terezinho’s company has been testing bots in real applications. “A bot is like what happens behind the scenes when you ask Alexa from Amazon, ‘How’s the weather today?’ or when doing a Google search,” says Terezinho. “The engine that is receiving your input, analyzing the data with different algorithms and providing interesting results for you is what is called a bot.”
For example, Terezinho’s company created a simple proof-of-concept HMI bot application. When the software recognized the tablet was near a machine, it would “bot” information to the top of the tablet, such as temperature, production data and alarms.
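The proof of concept described above could be sketched roughly as follows. This is a hypothetical illustration, not InduSoft's implementation: the machine names, status fields and the idea of using a Bluetooth beacon signal strength (RSSI) to decide the tablet is "near" a machine are all assumptions.

```python
# Hypothetical sketch: proximity-triggered HMI overlay. Assumes each machine
# broadcasts a BLE beacon and the tablet can read its signal strength (RSSI).
# Machine names, fields and the threshold are invented for illustration.

NEAR_RSSI_THRESHOLD = -60  # dBm; a stronger (less negative) signal means closer

MACHINE_STATUS = {
    "machine_a": {"temperature_c": 78.4, "units_today": 1250, "alarms": ["HIGH_TEMP"]},
    "machine_b": {"temperature_c": 64.1, "units_today": 980, "alarms": []},
}

def overlay_for(beacon_readings):
    """Return status payloads only for machines whose beacon signal is
    strong enough to count as 'near' the tablet."""
    return {
        machine: MACHINE_STATUS[machine]
        for machine, rssi in beacon_readings.items()
        if rssi >= NEAR_RSSI_THRESHOLD and machine in MACHINE_STATUS
    }

# machine_a is close (-52 dBm), machine_b is out of range (-80 dBm),
# so only machine_a's temperature, production data and alarms are pushed.
print(overlay_for({"machine_a": -52, "machine_b": -80}))
```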
Voice recognition was also enabled for this application. The user can ask the HMI application, for example, “What are the five most critical alarms?”
“As a first step, the HMI software would convert the voice to text,” says Terezinho. “The second step—the interesting one—is the text is then sent to a bot that is preconfigured on the cloud. The bot then delivers information back that was automatically filtered to include the five most critical alarms on the HMI.”
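The second step Terezinho describes, the cloud bot filtering alarms down to the five most critical, might look something like this in miniature. The field names, severity scale and tie-breaking rule are assumptions for illustration, not the actual schema of any HMI product.

```python
# Hypothetical sketch of the cloud-bot filtering step: given the recognized
# query, rank the alarm list and return the five most critical entries.
# Severity scale (higher = more critical) and fields are invented.

def five_most_critical(alarms):
    """Sort alarms by severity (highest first), breaking ties by most
    recent timestamp, and return the top five."""
    ranked = sorted(alarms, key=lambda a: (-a["severity"], -a["timestamp"]))
    return ranked[:5]

alarms = [
    {"tag": f"ALM{i}", "severity": s, "timestamp": i}
    for i, s in enumerate([3, 9, 5, 9, 1, 7, 8])
]

# The bot delivers back only the top five by severity (9, 9, 8, 7, 5).
print([a["tag"] for a in five_most_critical(alarms)])
```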
This is just the beginning, says Terezinho. “In the future, the goal is to have the user either talk, click a button or write a question and get the information requested without the need for a very strict syntax,” he says. “There will be no need to write specific program code. Just ask a question. Instead of programming and configuring graphics to display yesterday’s production for Machine A, the user will simply ask, ‘Give me yesterday’s production for Machine A.’”
Or perhaps a simple syntax such as “select production data from Table 1 where time equals yesterday” will be used. “It’s a more user-friendly, human way to interact with the system to get the data needed,” says Terezinho. “The databases that the bots are mining to collect, analyze and deliver the information to the user are traditionally what is called big data.”
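To make the "simple syntax" idea concrete, here is a minimal sketch of a parser that maps a phrase of that shape onto a structured lookup. The pattern, table name and production figures are invented for illustration; a real system would sit in front of an actual database or a machine-learning bot.

```python
# Hypothetical sketch: map a fixed, simple query syntax onto a data lookup.
# The regex pattern and the table contents are invented examples.

import re

PRODUCTION = {
    "Table 1": {"yesterday": 1250, "today": 430},
}

QUERY = re.compile(
    r"select production data from (?P<table>.+?) where time equals (?P<when>\w+)",
    re.IGNORECASE,
)

def answer(question):
    """Return the requested figure, or None if the phrase doesn't match
    the simple syntax (a real system might then hand the free-form text
    to a machine-learning bot instead)."""
    m = QUERY.match(question.strip())
    if not m:
        return None
    table, when = m.group("table"), m.group("when").lower()
    return PRODUCTION.get(table, {}).get(when)

print(answer("select production data from Table 1 where time equals yesterday"))
# → 1250
```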
HMIs can do a lot now. They can collect data from many dissimilar items, filter it, push it to the cloud where analytics can happen and send usable results back to the user. They are providing actionable data more quickly than ever before. It’s not just one thing; it’s all of these capabilities, along with augmented reality and bots, converging into one solution whose options can be combined to serve the new generation of operators entering the workforce.
“Five years ago or so, when we added support for gestures and multi-touch, many of our customers came to us and wondered what the value of that was,” says Terezinho. “Why swipe a screen when we can just click a button to go to another screen? You could argue that clicking a button is faster. However, the point is that for the users coming to the market now, it may be more intuitive, or they may simply expect, that swiping the screen rather than pushing a button takes the HMI to the next screen.”
The HMI will be interfacing with many apps. “The HMI will keep the core strategy and vision, but you adapt that to technology in general, as it evolves,” says Terezinho. “Expectations in the market evolve in such a way that, if you compare the new solution to what was available 20 years ago, the core concepts are the same, but the way they look is completely different.”
Terezinho has a slide from 20 years ago that sets the vision. It was basically data communication, manipulation and presentation. The slide used today has exactly the same concepts of reading data, manipulating it and presenting it, but the graphical interface shown on the slides is completely different. Twenty years ago, it showed what was modern for the time: a desktop PC running Windows 3.11 on DOS. That’s it. Today, it’s the same slide on data presentation, but with tablets, smartphones, Microsoft HoloLens, Google Glass and Apple Watch added. We are still consuming information, but the way information is presented, and how we interact with it, is completely different.
“It is not about adding functionality you didn’t have before,” says Terezinho. “It’s about adding functionality that is more efficient to whoever is using the system. Users tomorrow will just want to enter a quick search string to find the information they need when they need it. Both ways collect and display the same information. One way is just simpler and more efficient to the future work force than the other.”