How to use digital twins to solve nonlinear control challenges for rare earth extraction

A 4-layer cyber-physical framework for REEP optimization
April 20, 2026
8 min read

Key Highlights

  • Rare earth extraction requires a move beyond traditional PID control to a digital twin framework capable of handling highly coupled, nonlinear sub-processes and significant time delays.
  • By using image processing and machine learning to analyze solution color, the framework reduces chemical component detection time from hours to minutes, enabling immediate operational adjustments.
  • The system uses case-based reasoning to formalize the intuitive knowledge of experienced operators into automated, repeatable control strategies that maintain stability under varying industrial conditions.

AVADH NAGARALAWALA, INDEPENDENT CONSULTANT

Avadh Nagaralawala, independent consultant, will present "Smarter Mining: Harnessing Automation and Control Systems for Safe, Sustainable Operations" at 3:30 pm on June 23 during A3's Automate 2026 in Chicago.

Nagaralawala will explore how automation technologies are redefining mining operations. He will share real-world examples of designing and implementing control systems that deliver efficiency and resilience in harsh industrial environments. The presentation will highlight PLC–SCADA integration, predictive monitoring and analytics, safety through automation, sustainability at the control level and the future outlook for digital transformation, Industrial IoT and cybersecurity. Attendees will gain actionable strategies for modernizing mining operations, ensuring consistency and positioning organizations to meet the evolving demands of global resource markets.


Rare earth elements sit quietly at the foundation of nearly every technology we rely on today, from electric vehicle motors and wind turbines to defense electronics and medical imaging equipment. But the processes behind extracting and purifying these elements are anything but quiet.

Having worked across mining automation and controls engineering for years, I can say with confidence that rare earth extraction is one of the most stubborn process control challenges in the industry. It is nonlinear, it carries long time delays, and its hundreds of cascade sub-processes are tightly coupled in ways that make traditional PID-only strategies fall well short of what production demands.

The question I kept coming back to was this: can we build a system that doesn't just monitor the process, but truly understands it in real time, predicts its behavior and acts on that intelligence automatically?

My work in this space has centered on a novel digital twin (DT) framework specifically designed for rare earth extraction processes, a framework where I played a central role in conceptual design, technical architecture and engineering validation. What follows is a practical account of how that framework is structured, what technologies drive it and what it has demonstrated under real industrial conditions.

 

Why rare earth extraction demands a different approach

A typical rare earth extraction process (REEP) involves a dissolution circuit, a multi-stage extraction circuit using solvent-based separation, a scrubbing section and a final precipitation and dehydration stage. Each of these stages interacts with the next. Adjust the flow rate in extraction, and you affect downstream purity. Miscalculate acid concentration in the scrubbing section, and the product quality shifts in ways that won't show up until several process steps later.

On top of that, measuring the actual component content of the solution—how much cerium, praseodymium or neodymium is present at any given stage—traditionally required lab analysis that took an hour or more. In a continuous process, that lag is operationally crippling. Operators would make decisions based on gut feel and experience, which worked, but it wasn't repeatable and it wasn't scalable.

This is where a well-designed digital twin changes the equation.

 

The four-technology framework

The DT framework integrates four core technologies that work together across a cyber-physical architecture. My contribution spans the engineering rationale behind each layer—from measurement strategy to control design and virtual inspection.

1. Soft measurement of component content: Rather than waiting for laboratory results, this approach captures the color of extracted solutions using an image processing pipeline. Rare earth solutions carry characteristic color profiles that shift predictably with component concentration—something experienced operators have long observed visually, but which had never been formalized into a reliable computational model.

The framework applies a grey-edge illumination compensation algorithm, optimized through a genetic algorithm to handle the harsh and variable lighting conditions typical of a production floor. This is followed by feature extraction across both the HSI and RGB color spaces, feeding into a weighted least squares support vector machine (WLS-SVM) that maps those color characteristics to actual component content values in the solution.
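To make the first step of that pipeline concrete, here is a minimal sketch of grey-edge illumination compensation in Python with NumPy. The framework tunes the algorithm with a genetic algorithm; this sketch instead fixes the Minkowski norm order and omits that search, and the function name and default parameter are illustrative, not the production implementation.

```python
import numpy as np

def grey_edge_correct(img, minkowski_p=5):
    """Grey-edge color constancy: estimate the illuminant from the
    Minkowski p-norm of per-channel image derivatives, then normalize.
    img: float array of shape (H, W, 3) with values in [0, 1]."""
    illum = np.zeros(3)
    for c in range(3):
        gy, gx = np.gradient(img[:, :, c])
        mag = np.sqrt(gx**2 + gy**2)
        # Minkowski p-norm of the derivative magnitude for this channel
        illum[c] = (mag**minkowski_p).mean() ** (1.0 / minkowski_p)
    illum /= np.linalg.norm(illum)                 # unit-norm illuminant estimate
    corrected = img / np.maximum(illum * np.sqrt(3), 1e-8)
    return np.clip(corrected, 0.0, 1.0), illum
```

In the framework's version, a genetic algorithm would search over parameters such as `minkowski_p` to maximize agreement with reference measurements under actual plant lighting.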

This method is particularly well-suited to industrial environments where training datasets are limited. The sensing architecture I was involved in shaping reduced component detection time from a process that previously spanned the better part of a shift down to something operators could act on within minutes. For real-time control, that difference fundamentally changes what is possible.

2. Mechanism-compensation process simulation: Static mass-balance models of extraction processes are a reasonable starting point, but they drift under real operating conditions. Feed compositions change, equipment ages, and separation behavior shifts in ways that fixed coefficients cannot capture.

A core part of my technical contribution was the engineering rationale for introducing a dynamic compensation coefficient into the standard mass and element balance equations governing each extraction stage. This coefficient is continuously updated using live production data through an improved particle swarm optimization (PSO) algorithm—one that converges more reliably than conventional implementations with a functional inertia weight and constriction factor.
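The calibration loop can be sketched as a particle swarm using Clerc's constriction factor together with a linearly decaying (functional) inertia weight. The hyperparameters and bounds below are illustrative; in the framework, the objective would be the residual between simulated and measured component content, with the compensation coefficient among the decision variables.

```python
import numpy as np

def pso_minimize(f, lb, ub, n_particles=20, iters=60, seed=0):
    """Particle swarm with Clerc constriction and a linearly decaying
    inertia weight; returns the best position and objective value."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    x = rng.uniform(lb, ub, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    c1 = c2 = 2.05
    phi = c1 + c2
    chi = 2.0 / abs(2.0 - phi - np.sqrt(phi**2 - 4.0*phi))  # ~0.7298
    for t in range(iters):
        w = 0.9 - 0.5 * t / max(iters - 1, 1)   # functional inertia weight
        r1, r2 = rng.random((2, n_particles, dim))
        v = chi * (w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x))
        x = np.clip(x + v, lb, ub)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())
```

The constriction factor guarantees bounded velocities without ad-hoc clamping, while the decaying inertia weight shifts the swarm from exploration toward refinement as calibration proceeds.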

The calibrated simulation covers all extraction and scrubbing stages across the full process chain. The fidelity achieved across cerium, praseodymium and neodymium predictions was validated against real industrial data and remained well within the limits that the industry considers acceptable for process decision-making. That level of accuracy is what makes closed-loop process optimization genuinely viable rather than theoretical.

3. Case-based reasoning control strategy: One of the harder problems in process control is capturing the intuitive knowledge that experienced operators carry. Engineers who have run an extraction plant for years develop a feel for what flow-rate adjustments are needed under different conditions. That knowledge is valuable, but it rarely finds its way into the control system in any structured form.


The case-based reasoning module in this framework addresses that directly. The design I contributed to maintains a structured case library built from historical production records, each case characterized by key process parameters. When a new process state arises, a nearest-neighbor retrieval algorithm identifies the closest historical operating case and draws out the corresponding extractant and detergent flow rate presets that proved effective under similar conditions.
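The retrieval step described above reduces to a nearest-neighbor search over normalized process parameters. A minimal sketch, with a hypothetical feature layout and preset structure:

```python
import numpy as np

def retrieve_case(library, state, scales):
    """Nearest-neighbor retrieval over a case library.
    library: list of (feature_vector, presets) pairs; `scales` normalizes
    each process parameter so no single feature dominates the distance."""
    scales = np.asarray(scales, float)
    feats = np.array([case[0] for case in library], float) / scales
    query = np.asarray(state, float) / scales
    idx = int(np.linalg.norm(feats - query, axis=1).argmin())
    return library[idx][1]
```

In practice the library would be mined from historical production records, and retrieved presets would serve as starting points for the downstream control layer rather than final commands.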

Those presets feed into a cascade control architecture—a fuzzy inference compensation layer working above standard PID flow loops—which adjusts flow rates dynamically in real time based on continuous soft measurement feedback.
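The shape of that cascade can be sketched as an outer fuzzy compensation layer trimming the setpoint of an inner PID flow loop. The membership functions, rule base and gains below are placeholders for illustration, not the plant tuning.

```python
class PID:
    """Inner-loop flow controller (positional form)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0
    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp*err + self.ki*self.integral + self.kd*deriv

def tri(x, a, b, c):
    """Triangular membership function over [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_trim(content_err, max_trim=0.2):
    """Outer compensation layer: map soft-measured content error to a
    flow-setpoint trim via three fuzzy sets and centroid defuzzification."""
    mu_neg = tri(content_err, -2, -1, 0)
    mu_zero = tri(content_err, -1, 0, 1)
    mu_pos = tri(content_err, 0, 1, 2)
    den = mu_neg + mu_zero + mu_pos
    return (-max_trim*mu_neg + max_trim*mu_pos) / den if den else 0.0
```

Each control cycle, the soft-measurement error passes through `fuzzy_trim` to nudge the flow setpoint, and the PID loop then drives the valve toward that trimmed setpoint.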

The result is a two-layer strategy that combines the stability of proven historical practice with the responsiveness of live feedback control. This design philosophy was one I advocated for strongly throughout the project, precisely because it bridges the gap between human operational knowledge and automated system behavior.

4. Virtual workshop for remote inspection and fault management: The fourth layer of the framework is a fully interactive three-dimensional digital replica of the production floor, built to maintain live synchronization with the physical plant. Real-time data flows from field sensors through the DT data platform and into the virtual model continuously, ensuring the digital environment always reflects actual plant conditions.

One of the operational requirements I helped define was enabling plant personnel to inspect equipment status remotely, moving through the virtual environment freely, reviewing the state of every motor, pump and extraction tank without requiring physical access to the floor. The system monitors motor states using device identifiers and timestamps, detecting abnormal conditions early and issuing predictive fault warnings before minor issues become costly production stoppages.
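The monitoring logic described here can be sketched as a scan over timestamped device readings that flags both stale data and abnormal states. The record shape, state names and threshold below are assumptions for illustration; the real system feeds such warnings into the virtual workshop.

```python
from datetime import datetime, timedelta

def flag_stale_or_abnormal(readings, now,
                           stale_after=timedelta(seconds=30),
                           abnormal_states=("fault", "overtemp")):
    """Scan (device_id, timestamp, state) readings, keep the latest per
    device, and flag devices that are stale or report an abnormal state."""
    latest = {}
    for device_id, ts, state in readings:
        if device_id not in latest or ts > latest[device_id][0]:
            latest[device_id] = (ts, state)
    alerts = []
    for device_id, (ts, state) in latest.items():
        if now - ts > stale_after:
            alerts.append((device_id, "stale"))   # sensor silent too long
        elif state in abnormal_states:
            alerts.append((device_id, state))
    return alerts
```

Treating a silent sensor as its own alert condition is what allows predictive warnings before a minor issue, or a failed measurement chain, becomes a production stoppage.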

The reduction in time required for equipment inspection was substantial, and the shift from reactive to predictive fault management represents one of the more practically significant outcomes of the framework in day-to-day operations.

 

What industrial validation showed

The framework was validated against real industrial data across the full process chain. Component content predictions for cerium, praseodymium and neodymium all remained within the bounds that industry considers reliable for production decision-making—a result that reflects the engineering decisions embedded at every layer of the framework, decisions I was directly involved in evaluating, refining and validating.

Beyond the modelling accuracy, the operational outcomes were equally telling. Inspection workflows that previously demanded significant time from engineering personnel were compressed dramatically. Component detection that had operated on a timescale of hours became something achievable within a single operational window. These are not incremental improvements; they represent a meaningful shift in how operators interact with and manage a complex extraction process.

 

Where this is heading

The work isn't finished. Future priorities include improving soft measurement performance under varying container transparency conditions, extending the simulation engine to handle higher-stage extraction configurations and migrating the service interface to a web-based platform to make it more accessible across different operational environments. These are areas where my ongoing technical involvement is focused, alongside improving the virtual workshop's usability for operators who aren't specialists in digital systems.

The rare earth sector sits at the intersection of strategic industrial demand and genuine process engineering complexity. What this framework demonstrates is that those two things—complexity and intelligent control—are not mutually exclusive. With the right architecture, the right sensing strategy and the right integration of process knowledge into the control layer, it is entirely possible to bring a level of precision and operational visibility to rare earth extraction that the industry has historically not had access to.

That, to me, is what controls engineering is for.

About the Author

Avadh Nagaralawala

Automation and control system consultant

Avadh Nagaralawala is an independent consultant with more than a decade of hands-on experience designing and implementing automation and control systems for mining and heavy industrial operations. He has actively contributed to thought leadership platforms through webinars and industry conferences, and he’s published research, particularly focusing on innovation, electrification and sustainability in heavy industries. Nagaralawala is an IEEE senior member and Project Management Institute Arizona’s Southern Branch Director. Read his article on how to leverage PLC/SCADA and digital twins for mining operations.
