By Ned Lecky
Accuracy. Precision. GR&R. Two words and an acronym that strike fear into the hearts of many a manufacturing engineer or control system designer. The terms and principles are extremely well-understood by most QA and process control specialists, but what about the rest of us mortals trying to wade through statistics without getting bogged down in too many details? To start with, we have some object that we wish to measure. We could be measuring weight, length, color, shape—the type doesn't matter. The measurement device we need—a scale or a ruler or a colorimeter—is called a gauge.
The accuracy of a measurement is the degree to which it is close to the actual, true measurement. The true measurement is taken, if you will, by some sort of perfect measurement system or a measurement system that is calibrated and tested to some extremely trustworthy degree. The accuracy of our on-the-floor measurement would be the difference between that perfect measurement and the one we get. If we have a block measured and certified to really be exactly 1 in. and our in-process gauge reads 1.053 in., this one single measurement is accurate to within 0.053 in.
That makes sense. Precision can be a little harder to grasp. Precision is a measure of how repeatable a measurement taken with a gauge is. In fact, repeatability is another name for precision. What if I measure the block with the same gauge three more times and get: 1.051, 1.052 and 1.054? Well, this is a relatively precise gauge. Our four readings vary over a range of only 0.003 in., and I might be tempted to say that the precision of the gauge is something like ±0.0015 in. In practice, we would use statistical methods to analyze all of these readings and use a statistical tool such as standard deviation to determine that, say, 90% of our readings would fall within some particular window.
What if my three measurements were: 1.041, 1.101 and 0.987? We see gauges like this in real factories all the time. Maybe it's an old micrometer and the lines are sort of worn off here and there. Maybe there is dirt or grit in the gauge. Our measurement technician does his best, but it should be no surprise that if we measure the block over and over again, the readings will vary considerably—in this case, over a range of more than 0.1 in. Clearly, this is a much less precise gauge.
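To make the arithmetic concrete, here is a short Python sketch comparing the two hypothetical gauges above. The variable names and the 1.645 factor (which gives a roughly 90% window if the readings are assumed normally distributed) are my illustrative choices, not something from the column:

```python
import statistics

# Readings from the column's two hypothetical gauges (inches)
precise_gauge = [1.053, 1.051, 1.052, 1.054]
worn_gauge = [1.041, 1.101, 0.987]

def summarize(readings):
    """Return the range (max - min) and sample standard deviation of readings."""
    spread = max(readings) - min(readings)
    stdev = statistics.stdev(readings)  # sample standard deviation
    return spread, stdev

for name, readings in [("precise gauge", precise_gauge), ("worn gauge", worn_gauge)]:
    spread, stdev = summarize(readings)
    mean = statistics.mean(readings)
    # Assuming roughly normal readings, ~90% fall within +/-1.645 standard deviations
    half_window = 1.645 * stdev
    print(f"{name}: range {spread:.3f} in., stdev {stdev:.4f} in., "
          f"~90% window {mean - half_window:.3f} to {mean + half_window:.3f} in.")
```

The worn gauge's standard deviation comes out dozens of times larger than the precise gauge's, which is exactly the "much less precise" verdict reached by eye above.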
Loss of precision can come from gauge inadequacies or from inadequate operator training. Differentiating between the two is important and is commonly accomplished using analysis of variance gauge repeatability & reproducibility (ANOVA GR&R), often just called GR&R. A rigorous statistical technique (ANOVA) is used to analyze multiple measurements made by multiple operators using the same gauge. This not only determines the precision of the gauge, but also reveals how precision and accuracy vary from operator to operator. Each gauge most likely has some tricks to its use and requires a certain amount of training, patience and practice to use well. The GR&R study provides a way to determine not only how good the gauge is, but how consistently each operator uses it and what each operator's skill level is.
GR&Rs are quite easy to do, contrary to factory myth. Just five or 10 measurements with three operators or so is usually enough to give great insight into the quality of the gauge and the consistency of operator training. A more detailed study with 20 measurements using, say, 10 operators is often as complicated as one needs to get to differentiate between accuracy, repeatability and operator training. The GR&R provides a hugely important data point for interpreting measurements coming from the factory floor.
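As a rough illustration of the idea behind such a study (a simplified variance split, not the full ANOVA GR&R procedure, which also handles multiple parts and interaction terms), here is a Python sketch with invented readings for three hypothetical operators each measuring the same block five times:

```python
import statistics

# Hypothetical data: three operators each measure the same block five times (inches)
readings = {
    "operator_a": [1.051, 1.052, 1.054, 1.053, 1.052],
    "operator_b": [1.049, 1.048, 1.050, 1.049, 1.051],
    "operator_c": [1.060, 1.058, 1.061, 1.059, 1.060],
}

# Repeatability: pooled within-operator variation (the gauge itself)
within_variances = [statistics.variance(r) for r in readings.values()]
repeatability = statistics.mean(within_variances) ** 0.5

# Reproducibility: spread of the operator averages (operator-to-operator variation)
operator_means = [statistics.mean(r) for r in readings.values()]
reproducibility = statistics.stdev(operator_means)

# Combined gauge variation adds the two components in quadrature
grr = (repeatability**2 + reproducibility**2) ** 0.5

print(f"repeatability sigma:   {repeatability:.4f} in.")
print(f"reproducibility sigma: {reproducibility:.4f} in.")
print(f"combined GR&R sigma:   {grr:.4f} in.")
```

In this made-up data set, operator_c reads consistently high, so the reproducibility term dominates the repeatability term: the gauge is fine, but one operator needs retraining. That kind of conclusion is exactly what the GR&R study is for.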
In today's automated measurement world, are GR&Rs obsolete? Absolutely not. In my industry, machine vision, we always use GR&R studies to evaluate our calibration (accuracy), the quality of our image acquisition and analysis algorithms (which affect precision) and the ability to run the same parts through different inspection stations and get the same measurements. Automated GR&Rs are quite easy to design and perform.
Ned Lecky is an ME and EE with 25 years of experience in control systems and machine vision. As owner of Lecky Integration (www.lecky.com), he consults for OEMs, system integrators and machine vision providers.