What does your sensing application require?

Because accuracy and precision come at a price, you need to understand just how accurate and how precise a system needs to be.

By Mike Bacidore, chief editor


Many sensing applications require both accuracy and precision. The trick is to know when one or the other takes precedence.

“Accuracy is the closeness of the measured quantity to the true answer,” explains Wade Mattar, flow product manager, Schneider Electric. “And precision is the closeness of repeated measurements to each other (Figure 1). Custody transfer, filling operations and batch operations are a few examples that require a combination of both accuracy and precision, or reliability.” But many other applications favor one over the other.
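
Mattar's two definitions map directly onto two simple statistics: accuracy is the offset of the mean reading from the true value, and precision is the spread of repeated readings. A minimal sketch in Python, using hypothetical flowmeter readings against an assumed 100.0 L/min reference (all values invented for illustration):

```python
import statistics

def accuracy_and_precision(readings, true_value):
    """Accuracy: how close the mean reading is to the true value (bias).
    Precision: how tightly repeated readings cluster (sample std dev)."""
    mean = statistics.mean(readings)
    bias = mean - true_value           # systematic offset from the truth
    spread = statistics.stdev(readings)  # scatter of the repeats
    return bias, spread

# Hypothetical flowmeter readings against a known 100.0 L/min reference
readings = [100.8, 100.9, 101.0, 100.9, 100.8]
bias, spread = accuracy_and_precision(readings, 100.0)
# A large bias with a small spread is the "precise but not accurate" case.
```

Here the readings cluster within about 0.1 L/min of each other (good precision) but sit almost 0.9 L/min above the reference (poor accuracy), illustrating why the two qualities must be specified separately.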

For example, although a robot may have very good repeatability at +/-0.020 mm, it has difficulty replacing a CNC machine due to the robot’s poor accuracy in this space, says David Perkon, vice president of advanced technology, AeroSpec. “A multi-axis CNC machine starting at a datum, or zero point, and moving 600 mm in any direction will move that exact distance within a tight tolerance,” he says. “A robot or multi-axis gantry, on the other hand, moving 600 mm in any direction will have much higher variation in the final position due to poorer accuracy, but it will precisely move to the same inaccurate position.”

Combining accuracy and precision around a taught robot position or within a limited range of motion is common and is a good practice if accuracy is suspect, Perkon explains. “Where poor accuracy starts to show is in large spaces where the desired positions are calculated from a single starting position,” he says. “The error adds up. However, teaching multiple points and taking advantage of the robot’s or motion system’s repeatability often helps.”
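
Perkon's point that "the error adds up" over calculated moves can be sketched with a toy simulation. The per-move accuracy error of +/-0.1 mm below is an assumed figure for illustration, not a number from the article:

```python
import random

random.seed(0)  # fixed seed so the sketch is repeatable

def calculated_path_error(n_moves, accuracy_error_per_move):
    """Each move calculated from the previous position carries its own
    accuracy error, so the errors accumulate over the path."""
    total = 0.0
    for _ in range(n_moves):
        total += random.uniform(-accuracy_error_per_move,
                                accuracy_error_per_move)
    return abs(total)

# Hypothetical robot: +/-0.1 mm accuracy on each calculated move
err_calculated = calculated_path_error(20, 0.1)
# A taught point, by contrast, is bounded by repeatability alone
# (e.g. +/-0.020 mm), no matter how many moves preceded it.
```

This is why teaching multiple points works: each taught position resets the accumulated error, leaving only the (much smaller) repeatability term.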

Applications that require both accuracy and precision include part dimension and tolerance measures, part positioning and other metrology tasks, explains Ben Dawson, director of strategic development, Teledyne Dalsa. “Machine vision tasks that use edge-based measurements lend themselves to this combination because we can measure edge position to a fraction of a pixel and have calibration methods that reliably translate pixels into standard measures,” he says. “On the other hand, some types of applications, such as verifying that a part is present or detecting surface defects, generally do not need high accuracy and precision; you just want to know if the part is there or undamaged.”
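
The calibration Dawson describes reduces to a conversion factor between pixels and standard measures. A minimal sketch, with the target size, pixel span and sub-pixel edge positions all invented for illustration:

```python
def pixels_to_mm(length_px, px_per_mm):
    """Convert a distance measured in (sub-)pixels into millimeters
    using a calibration factor from a target of known size."""
    return length_px / px_per_mm

# Hypothetical calibration: a 10 mm reference target spans 400.0 pixels
px_per_mm = 400.0 / 10.0

# Edge positions located to a fraction of a pixel by the vision tool
left_edge_px, right_edge_px = 402.6, 823.4
width_mm = pixels_to_mm(right_edge_px - left_edge_px, px_per_mm)
```

Because the edges are found to a fraction of a pixel and the calibration ties pixels to a known standard, the result carries both precision (repeatable edge location) and accuracy (traceable scale).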

If an application needs to move a robot arm to the same place repeatedly for an assembly process, for instance, high precision may be desirable, says Matt Hankinson, senior technical marketing manager, MTS Sensors. “Typically, the linear placement is adjusted after the mechanical setup of the system to correct any offsets, so accuracy, or true distance traveled, isn’t required,” he explains. “If an application is not going through an initial adjustment and it’s critical to travel a known distance without any offset, then high accuracy may also be required from the sensor. The ISO 5725 series of standards covers these definitions in detail.”

Imagine using a linear scale inscribed on a metal bar to measure a length, offers Peter Thorne, director of the research analyst and consulting group at Cambashi. “Perhaps repeated readings give results all within 0.1%; this is a measure of the precision,” he explains. “If the scale had suffered an impact that compressed the metal bar, every reading might be 1% different from the true value. When using sophisticated measurement devices, there can be many possible sources of these systematic errors.”
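
Thorne's compressed-bar example separates the two error types cleanly: the 1% offset is systematic and can be removed by a single calibration against a known reference, while the 0.1% scatter is random and remains. A sketch with invented readings from a hypothetical 1000 mm bar:

```python
# Readings from a hypothetical 1000 mm bar whose scale reads ~1% high,
# with the repeats themselves agreeing to about 0.1%
true_length = 1000.0
readings = [1010.2, 1009.8, 1010.0, 1009.9, 1010.1]

mean = sum(readings) / len(readings)
scale_factor = true_length / mean        # one calibration against a standard
corrected = [r * scale_factor for r in readings]
# The systematic 1% offset is removed; the random spread (precision)
# of the corrected readings is essentially unchanged.
```

Calibration fixes accuracy; only better tooling, sensors or averaging improves precision.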


The development of a production process can define calibration procedures to achieve the required accuracy, as well as tooling or sensor specifications, setup and testing to achieve the required precision, explains Thorne.

“Statistical methods then provide an effective way of handling the variations found during production,” he says. “It may be possible to identify trends in readings and predict when they will fall outside of specification, enabling some preventive action before this happens. The pattern of readings may be enough to identify likely causes. For example, electrical current consumption on start-up can identify wear and tear of motor bearings. There is a trade-off between extensive measurement—for best prediction and determination of problems—and the time and cost of measurements—fewer sensors and readings generally make a process step faster and lower cost. Good specifications help handle these trade-offs.”
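
Thorne's idea of predicting when readings will drift out of specification can be sketched with a least-squares trend line. The motor start-up currents and the 6.0 A limit below are hypothetical values chosen for illustration:

```python
def samples_until_limit(readings, limit):
    """Fit a straight line (least squares) through recent readings and
    estimate the sample index at which the trend crosses the limit."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
             / sum((x - x_mean) ** 2 for x in xs))
    if slope <= 0:
        return None  # no upward trend toward the limit
    intercept = y_mean - slope * x_mean
    return (limit - intercept) / slope

# Hypothetical start-up currents drifting upward as bearings wear
currents = [5.0, 5.1, 5.2, 5.3, 5.4]
t_cross = samples_until_limit(currents, limit=6.0)
# The trend suggests the limit is reached around the 10th sample,
# leaving time for preventive maintenance before it happens.
```

In practice this trade-off plays out exactly as Thorne describes: more sensors and readings sharpen the prediction, but each one adds time and cost to the process step.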

  • accuracy = “the fact” of being exact or correct; precision = “the quality (how good or bad something is)” of being exact. Source: http://dictionary.cambridge.org


  • If accuracy is the “fact” of being exact or correct (http://dictionary.cambridge.org), then the Figure 2 case of “accurate, not precise” is only possible if the tolerance on the defined value spans the entire target. With a tolerance that large, a point anywhere on the target would count as accurate. Otherwise, the only truly accurate value is the bull’s-eye.


  • I agree; I was confused by the “accurate, not precise” case. I can’t speak to its accuracy (no pun intended), but I’ve seen many explanations in many places, and I’ve never seen one that looks like that.

