How edge controllers balance closed core for deterministic control with an open environment for containerized applications

Beyond the 20-year machine: Learn to manage Linux LTS, Podman and eMMC wear-out
April 21, 2026
8 min read

Key Highlights

  • Effective cybersecurity for edge devices relies on a multi-layered approach that anchors software-level encryption (TLS/SSL) to physical hardware security like TPM 2.0 and secure boot to prevent spoofing and unauthorized access.
  • Unlike traditional PLCs, edge controllers utilize high-performance processors that require strict adherence to thermal design limits and environmental certifications to prevent performance degradation and premature hardware failure.
  • To bridge the gap between 20-year machine lifecycles and rapid software evolution, engineers should use a closed core for deterministic control while maintaining an open environment for containerized applications and frequent security patching.

Mark Sekulich is business development lead—semiconductor at Yokogawa.

For security, should the edge device support hardware-accelerated encryption, such as TPM 2.0, for TLS/SSL certificates when pushing data to the cloud or an on-premises MQTT broker?

Mark Sekulich, business development lead—semiconductor, Yokogawa: For secure communication with cloud or on-premises MQTT brokers, it is strongly recommended to use edge devices that not only encrypt communications with TLS/SSL, but also protect certificate private keys with hardware-based security such as TPM 2.0.
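The software side of that recommendation can be sketched with Python's standard ssl module. This is a minimal illustration, not a product configuration: the CA path and the TLS 1.2 floor are assumptions, and on a TPM-equipped device the client private key would be held by the TPM (for example, exposed through a PKCS#11 provider) rather than loaded from a file on disk.

```python
import ssl

def make_broker_tls_context(ca_file: str) -> ssl.SSLContext:
    """Build a client-side TLS context for an MQTT broker connection.

    ca_file would point at the pinned broker CA certificate; it is left
    unloaded here so the sketch stays self-contained.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy TLS versions
    ctx.verify_mode = ssl.CERT_REQUIRED           # always authenticate the broker
    ctx.check_hostname = True                     # and verify its hostname
    # ctx.load_verify_locations(ca_file)                # pin the broker CA
    # ctx.load_cert_chain(certfile=..., keyfile=...)    # device identity; with a
    #                                                   # TPM, the key stays in hardware
    return ctx
```

An MQTT client library would then be handed this context when opening its connection, so every publish to the cloud or on-premises broker is both encrypted and mutually authenticated.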

What cybersecurity mechanisms, such as secure boot, certificate management, encrypted communications and role-based access control, should be implemented?

Mark Sekulich, business development lead—semiconductor, Yokogawa: Edge devices should implement a multi-layered cybersecurity framework that combines secure boot to ensure only authorized software can run, TPM-based certificate and key management to prevent spoofing, TLS to protect the confidentiality and integrity of communications, and role-based access control (RBAC) to manage user access and roles.
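The RBAC layer of such a framework reduces to mapping roles onto permission sets and checking every request against that map. A minimal sketch follows; the role and permission names are hypothetical, not taken from any specific product:

```python
# Hypothetical role-to-permission map for an edge controller.
ROLE_PERMISSIONS = {
    "operator": {"view_data"},
    "engineer": {"view_data", "edit_config"},
    "admin":    {"view_data", "edit_config", "manage_users", "update_firmware"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Return True only if the role explicitly grants the permission.

    Unknown roles get an empty permission set, so access is denied by
    default rather than granted by accident.
    """
    return permission in ROLE_PERMISSIONS.get(role, set())
```

For example, `is_allowed("operator", "update_firmware")` is denied while `is_allowed("admin", "update_firmware")` is granted, which keeps firmware updates restricted to administrative roles.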

Edge controllers often run much hotter than traditional PLCs because of their high-performance processors. Why is it important to know thermal limits?

Mark Sekulich, business development lead—semiconductor, Yokogawa: Edge controllers offer greater flexibility and processing power through the use of high-performance processors, but their thermal behavior can vary significantly depending on operating conditions. Running them without understanding their thermal limits can gradually lead to reduced performance, degraded control quality and a shorter component lifespan. For this reason, it’s important to clearly understand the thermal design limits and consider them when planning system design, installation location and operating load conditions.
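That planning step can be made concrete with a simple derating check: estimate the effective internal temperature as ambient plus a load-dependent self-heating rise, and verify it stays inside the thermal design limit. The 20 °C maximum rise and 55 °C limit below are illustrative assumptions, not datasheet values; real designs would use the vendor's thermal data.

```python
def effective_temperature(ambient_c: float, load_fraction: float,
                          max_rise_c: float = 20.0) -> float:
    """Estimate internal temperature as ambient plus a self-heating rise
    proportional to CPU load (0.0 to 1.0). max_rise_c is an assumed,
    illustrative figure."""
    return ambient_c + load_fraction * max_rise_c

def within_thermal_limit(ambient_c: float, load_fraction: float,
                         limit_c: float = 55.0) -> bool:
    """Check the estimate against an assumed design limit."""
    return effective_temperature(ambient_c, load_fraction) <= limit_c
```

Under these assumptions, a controller at 40 °C ambient and 50% load sits at an estimated 50 °C and passes, while the same unit at 50 °C ambient under full load estimates to 70 °C and fails, signaling a need for a cooler mounting location or a lighter operating load.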

What environmental and industrial certifications—temperature range, vibration resistance, IP rating, UL/CE compliance—does the hardware need for on-machine deployment on factory floors? What about inside machines?

Mark Sekulich, business development lead—semiconductor, Yokogawa: Edge controllers used in factory equipment must meet requirements for operating temperature range, vibration and shock resistance and compliance with electrical safety and EMC standards. When installed inside machinery, it is especially important to consider the effective temperature, including self-heating, and to design with a clear understanding of thermal limits and their impact on component lifespan. Environmental conditions and applicable standards should therefore be selected based on the installation location, whether inside a control cabinet or directly within the machine.

Machines can live for 20 years or longer, but software moves much faster. A clear update path is essential to keep the machine from becoming a security liability. What should be the guaranteed long-term support (LTS) window for a Linux kernel and security patches?

Mark Sekulich, business development lead—semiconductor, Yokogawa: While industrial machines are often used for more than 20 years, the lifecycle of Linux and modern security technologies is much shorter. Because of this, it is not realistic to design a system that relies on a single OS version for two decades.

Edge controllers therefore need the ability to apply Linux kernel and security updates as needed. This approach allows machines to operate securely over the long term while preserving existing control application assets.

The edge controller supports deterministic control based on precise timing and data consistency, which helps minimize the impact of OS updates on control performance.

What containerization or virtualization technologies, such as Docker or Kubernetes-based frameworks, are supported for deploying applications at the edge? How does the operating system support standard Docker runtimes, and how is persistent storage handled to prevent SD card or eMMC wear-out from frequent log writes?

Mark Sekulich, business development lead—semiconductor, Yokogawa: The edge controller can use Podman, a Docker-compatible, daemonless container technology. Because it runs without a central daemon, its footprint stays light and its impact on control tasks is minimized. In addition, to reduce wear on SD/eMMC storage, persistent data can be clearly separated from container images and managed independently.
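One common tactic behind that wear reduction is to buffer log lines in RAM and flush them in fewer, larger writes, since flash wear is driven by write frequency and write amplification more than by total bytes. The sketch below is an illustrative pattern, not the product's actual logging mechanism; in practice the log file would also live on a partition or mounted volume separate from the system image.

```python
import time

class BatchedLogWriter:
    """Buffer log lines in memory and flush them to flash in batches,
    either when the buffer fills or when a maximum age is reached."""

    def __init__(self, path: str, max_lines: int = 256, max_age_s: float = 30.0):
        self.path = path
        self.max_lines = max_lines
        self.max_age_s = max_age_s
        self._buf = []
        self._last_flush = time.monotonic()

    def log(self, line: str) -> None:
        self._buf.append(line)
        age = time.monotonic() - self._last_flush
        if len(self._buf) >= self.max_lines or age >= self.max_age_s:
            self.flush()

    def flush(self) -> None:
        # One append of many lines instead of one write per line.
        if self._buf:
            with open(self.path, "a") as f:
                f.write("\n".join(self._buf) + "\n")
        self._buf.clear()
        self._last_flush = time.monotonic()
```

The trade-off is bounded: at most `max_lines` lines (or `max_age_s` seconds of logs) are lost on sudden power failure, in exchange for dramatically fewer erase cycles on the SD/eMMC device.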

What are the advantages or disadvantages of an open or closed edge environment?

Mark Sekulich, business development lead—semiconductor, Yokogawa: Open edge environments can take advantage of Linux and open-source software to stay current with technology advances and security updates. However, without a well-defined update policy, they can introduce operational risks.

Closed edge environments provide greater stability, but they can struggle with long-term technology obsolescence.

This can be addressed by maintaining a closed core for control while keeping higher-level applications and data processing open. This approach balances the long lifecycle of industrial machinery with the fast pace of modern software evolution.

How do compute resources, such as CPU architecture, cores, RAM or storage, affect the ability to run analytics, vision or AI workloads locally?

Mark Sekulich, business development lead—semiconductor, Yokogawa: The resources required and potential bottlenecks can vary widely depending on the type of AI being used and how it is implemented. For example, AI algorithms designed for parallel processing depend heavily on the number of CPU cores available. In contrast, applications like image recognition or video analysis, which typically rely on GPUs, are largely influenced by GPU performance.

That said, simply choosing a high-performance computer is not always the best approach. Factors such as power consumption, cost and space constraints must also be considered. When integrating devices into manufacturing equipment or other end products, long-term availability and stable operation are equally important.

To make the right choice, it’s essential to understand the characteristics of the AI application, evaluate processing time and real-time requirements and select an edge controller that best fits those needs.


How can supported methods for remote management, firmware updates and device provisioning across large fleets of edge controllers be implemented?

Mark Sekulich, business development lead—semiconductor, Yokogawa: This can be accomplished using a dedicated configuration tool, which enables remote firmware updates and simplifies device provisioning across large fleets.

How can edge controllers integrate with existing SCADA, MES or cloud platforms for data exchange and system orchestration?

Mark Sekulich, business development lead—semiconductor, Yokogawa: Edge controllers can integrate with existing SCADA, MES and cloud platforms using standard protocols such as OPC UA and MQTT.

I/O and communication data are consolidated into a data stream that separates control logic from higher-level integration. This design enables data exchange and orchestration with upstream systems without requiring changes to the control program.
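The consolidation step can be pictured as packaging current tag values into one timestamped payload that an upstream gateway publishes over MQTT or maps into an OPC UA address space. The sketch below is illustrative only; the tag names and payload shape are assumptions, not a product schema:

```python
import json
import time

def build_snapshot(tags: dict) -> str:
    """Serialize a set of I/O tag values into a single timestamped JSON
    payload for upstream SCADA/MES/cloud integration. The control logic
    that produced the values never needs to know about this format."""
    return json.dumps({
        "ts": time.time(),   # acquisition timestamp
        "values": tags,      # tag name -> current value
    })
```

Because the control program only writes tag values and the gateway only reads them, either side can change, adding a new cloud destination, for instance, without touching the other.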

What is the maximum number of concurrent industrial protocol tags that the internal OPC UA server can bridge to the edge side?

Mark Sekulich, business development lead—semiconductor, Yokogawa: The specifications do not define a fixed maximum number of industrial protocol tags that the OPC UA server can bridge. The actual number of tags that can be connected simultaneously depends on the overall system configuration, including tag update rates, event frequency, CPU and memory usage and OPC UA subscription settings.

Because of this, it is more important to design the system based on the purpose of the data and its update characteristics, rather than focusing only on the total number of tags.

How important is it to know the bandwidth of the internal data bus that moves variables from the fieldbus to the high-level applications?

Mark Sekulich, business development lead—semiconductor, Yokogawa: When transferring variables from fieldbus networks to higher-level applications, it’s important to understand the bandwidth of the internal data bus, especially in applications that handle high-speed or high-density data. Because the internal bus is shared for control, data acquisition and upstream communication, transferring large numbers of variables at high rates can impact overall system performance and available headroom.

For this reason, applications should be designed with consideration not only for the number of variables, but also for their data size and update frequency.
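A back-of-envelope load estimate makes that design consideration concrete: multiply variable count by data size by update rate and compare the result to the bus capacity with headroom in reserve. The 10 MB/s capacity and 50% budget below are illustrative assumptions, not specifications of any particular controller.

```python
def bus_load_bytes_per_s(num_vars: int, bytes_per_var: int,
                         update_hz: float) -> float:
    """Estimated internal bus load: variable count x size x update rate."""
    return num_vars * bytes_per_var * update_hz

def headroom_ok(num_vars: int, bytes_per_var: int, update_hz: float,
                bus_capacity_bps: float = 10e6, budget: float = 0.5) -> bool:
    """Keep estimated load under a fraction of assumed bus capacity,
    leaving the remainder for control, data acquisition and upstream
    communication sharing the same bus."""
    load = bus_load_bytes_per_s(num_vars, bytes_per_var, update_hz)
    return load <= budget * bus_capacity_bps
```

Under these assumptions, 1,000 four-byte variables at 100 Hz produce 400 kB/s and fit comfortably, while 100,000 eight-byte variables at the same rate would demand 80 MB/s and clearly exceed the budget, signaling a need to lower update rates or reduce the variable set.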

Tell us about one of your company’s state-of-the-art products, if any, that involves edge computing.

Mark Sekulich, business development lead—semiconductor, Yokogawa: The A8 controller, released in North America in January, is a Linux-based edge controller running Ubuntu 24.04 with real-time patches to deliver fast, reliable performance (Figure 1).

The A8 is an industrial edge controller designed for installation inside control cabinets and meets requirements for temperature, vibration, EMC and electrical safety. It complies with IEC 61131-2, the basic standard for PLCs and control devices, and meets CE, UL and CSA regulations. The controller has an IP20-equivalent rating and is designed to operate within a temperature range of 0–55 °C, evaluated as the ambient temperature inside the control cabinet.

The A8 is a controller built on real-time Linux—PREEMPT_RT—that enables deterministic control through precise timing and consistent data handling.

I/O and communication data are updated deterministically by the system, independent of the application’s execution cycle. Each data point is assigned a high-precision timestamp, clearly defining when it was acquired or when an output occurred. Time can also be shared across multiple controllers, enabling coordinated and deterministic control across distributed systems.

At the same time, the A8 supports traditional embedded controller methods such as fixed-cycle control and real-time interrupts, allowing it to be used in familiar ways for existing control applications.

About the Author

Mike Bacidore

Editor in Chief

Mike Bacidore is chief editor of Control Design and has been an integral part of the Endeavor Business Media editorial team since 2007. Previously, he was editorial director at Hughes Communications and a portfolio manager of the human resources and labor law areas at Wolters Kluwer. Bacidore holds a BA from the University of Illinois and an MBA from Lake Forest Graduate School of Management. He is an award-winning columnist, earning multiple regional and national awards from the American Society of Business Publication Editors. He may be reached at [email protected] 
