
How to Get High Precision and Accuracy for OEMs and System Integrators

April 3, 2014
Ken Brey Says Accuracy is the Degree to Which a Measurement Agrees with an Established Reference Value, and Precision is the Degree to Which a Measurement System's Results Agree With its Own Repeated Measurements

Machine builders, robot builders, test system OEMs and the system integrators that serve them frequently deal with precision and accuracy, and they well understand that these two elements of measurement are related, but not the same.

"Accuracy is the degree to which a measurement agrees with an established reference value, and precision is the degree to which a measurement system's results agree with its own repeated measurements,” explains Ken Brey, technical director at control system integrator DMC in Chicago.

Recently, DMC project manager Jesse Batsche needed an isolated voltage source accurate enough to simulate battery cell outputs for an automated battery management system (BMS) tester. Accuracy was a crucial specification in this case: the voltage outputs needed to be adjustable from 2 V to 7 V with an accuracy of ±2 mV. Batsche couldn't find a commercially available voltage source that met the combined requirements of programmatic adjustability, high isolation, relatively high current and extreme accuracy.

The BMS test system already had accurate digital multimeters and a complex network of controlled relays, which allowed the current or voltage at any node to be measured. Batsche used these tools to his advantage, implementing a control loop that measured the actual voltage output and adjusted the command voltage until the output reached the desired level. The low precision of the voltage source was overcome by using a high-precision, high-accuracy voltmeter in a feedback loop.
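
A minimal sketch of that measure-and-correct loop is shown below. The helpers set_command_voltage() and read_dmm_voltage() are hypothetical stand-ins for the tester's actual voltage-source and multimeter drivers, and the iteration limit is an assumed safeguard.

    # Minimal sketch of the measure-and-correct loop described above, with
    # hypothetical set_command_voltage() and read_dmm_voltage() driver functions.
    TOLERANCE_V = 0.002      # +/-2 mV accuracy target
    MAX_ITERATIONS = 20      # assumed safeguard against a non-settling output

    def settle_output(target_v, set_command_voltage, read_dmm_voltage):
        """Trim the command voltage until the measured output is within tolerance."""
        command_v = target_v
        for _ in range(MAX_ITERATIONS):
            set_command_voltage(command_v)
            measured_v = read_dmm_voltage()      # high-accuracy DMM reading
            error_v = target_v - measured_v
            if abs(error_v) <= TOLERANCE_V:
                return measured_v                # output is on target
            command_v += error_v                 # nudge the command by the measured error
        raise RuntimeError("Output did not settle within tolerance")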

On another occasion, Brey was called late into a project cycle to help fix an existing automated test system responsible for measuring compression force on part samples of compliant material. Force was measured using a piezoelectric transducer known for linearity over a wide range.


Brey got the most he possibly could out of the test system, delivering accuracy that met the requirements, but the existing system ultimately fell short of the customer's requirements for precision. The test system used a PLC, and its scan time allowed only one measurement to be captured during the window when the force reading was stable. Electrical noise had a large negative impact on that single-data-point measurement, and the project was eventually scrapped.

A few years later, the same client was ready to try again, and this time was willing to invest whatever was required to meet the precision goals. Armed with the previous experience, Brey set out to eliminate the weak links of the system by applying proven elements, along with hardware and technologies he trusted to give stable results over time.

Brey selected high-resolution DAQ hardware: a National Instruments four-channel, 100-kS/s/ch, 16-bit, ±10-V analog input module with a higher sampling rate and lower temperature drift than the PLC. The modified measurement system was able to over-sample, acquiring thousands of readings during the interval when the part was stable. Averaging those measurements reduced the high-frequency electrical noise. Once calibrated, the new test system's measurements were precise and stayed in alignment over time.
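
A sketch of the over-sample-and-average step is shown below; read_force_block() is a hypothetical stand-in for the DAQ driver call that returns the burst of readings captured while the part is held stable, and the 5,000-sample count is an assumed value.

    import statistics

    # Sketch of over-sampling and averaging; read_force_block() is a hypothetical
    # DAQ driver call returning the readings captured while the part is stable.
    def averaged_force(read_force_block, n_samples=5000):
        """Average a burst of readings to suppress high-frequency electrical noise."""
        samples = read_force_block(n_samples)     # e.g. 5,000 readings at 100 kS/s
        mean_force = statistics.fmean(samples)    # averaging cancels zero-mean noise
        spread = statistics.stdev(samples)        # residual spread indicates noise level
        return mean_force, spread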

Measurement systems must continue to improve if they're to meet ever-increasing customer demands for high-quality products verified by testing. A guideline to strive for in many industrial measurement applications is a 10-to-1 ratio between the allowable process variation and the precision of the measurement; for example, if a part's tolerance band is ±10 N, the measurement system should repeat to within about ±1 N.

Although every measurement system has its own unique set of issues, these techniques should be considered to get the most out of your hardware:

  • Use differential voltage inputs, with the noise shared on both lines subtracted out (see the sketch after this list).
  • Move the measured signal into the noise-immune digital world by making analog-to-digital conversions as early as possible.
  • Use current-based instead of voltage-based transducers when distances can't be minimized.
  • Over-sample measurements and average the results to reduce high-frequency noise.
  • Use equipment that has low temperature drift, even in controlled environments.
  • Carefully ground the shields of cables carrying analog signals, typically on just one end.
  • Avoid ground loops.
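
The sketch below illustrates the first bullet: noise coupled equally onto both lines of a differential pair cancels when one line is subtracted from the other. The signal level, noise amplitudes and sample count are made-up values for illustration only.

    import random
    import statistics

    # Illustration of differential measurement: common-mode noise appears on both
    # lines and cancels in the subtraction, leaving only the small uncorrelated part.
    SIGNAL_V = 1.000
    single_ended = []
    differential = []
    for _ in range(10_000):
        common_noise = random.gauss(0.0, 0.050)               # noise coupled onto both lines
        v_plus = SIGNAL_V + common_noise + random.gauss(0.0, 0.002)
        v_minus = common_noise + random.gauss(0.0, 0.002)
        single_ended.append(v_plus)                           # signal plus all the noise
        differential.append(v_plus - v_minus)                 # common-mode noise cancels

    print("single-ended std dev (V):", round(statistics.stdev(single_ended), 4))   # ~0.050
    print("differential std dev (V):", round(statistics.stdev(differential), 4))   # ~0.003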

Making measurements with high accuracy and precision often necessitates wringing every last bit of performance from the measurement hardware and software. Employing the above tips and tricks can often help to bring the measurement system into compliance.

About the Author

Dan Hebert | PE

Dan Hebert is a contributing editor for Control and Control Design.
