Sunday, February 6, 2011

Accuracy vs. Precision

Posted: 01/20/2011 by Lanny Wilson in Home Page Ramblings

Accuracy

To the layman, the terms accuracy and precision are interchangeable; to the metrologist, they are two distinctly different concepts. Hardware manufacturers use these specifications as marketing tools, and seem to prefer to keep the "public" in a gray area as to how accurate and precise their equipment really is.
To be clear, the following definitions of accuracy and precision can be found on Wikipedia:
In the fields of science, engineering, industry, and statistics, the accuracy of a measurement system is the degree of closeness of measurements of a quantity to its actual (true) value. The precision of a measurement system, also called reproducibility or repeatability, is the degree to which repeated measurements under unchanged conditions show the same results.
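The distinction in those definitions can be made concrete with a short sketch. The gauge-block value and readings below are invented for illustration; accuracy is shown as the bias of the mean reading from the true value, and precision as the standard deviation of the repeated readings:

```python
import statistics

# Hypothetical repeated measurements of a gauge block whose
# true (reference) length is 1.0000 inch.
true_value = 1.0000
measurements = [1.0004, 1.0006, 1.0005, 1.0005, 1.0004, 1.0006]

mean = statistics.mean(measurements)

# Accuracy: closeness to the true value, here the bias of the mean.
accuracy_error = mean - true_value

# Precision: agreement of repeated measurements with each other,
# commonly reported as a standard deviation.
precision = statistics.stdev(measurements)

print(f"mean reading:              {mean:.4f} in")
print(f"bias (accuracy error):     {accuracy_error:+.4f} in")
print(f"spread (precision, 1 sigma): {precision:.5f} in")
```

Note that the two qualities are independent: this instrument is quite precise (the readings cluster tightly) yet inaccurate (they cluster 0.0005 inch away from the true value), which is exactly the case a single brochure number can hide.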
It would seem from these definitions that it should be fairly easy to come up with universal specifications for metrology systems; however, a quick review of brochures found online quickly clouds the issue. Accuracy specifications are always stated, as is the precision specification, but the "gray area" lies in the methods used to create the specification: How many measurements were taken? Over what volume were the measurements taken to produce the specification? Some manufacturers clearly define their testing process, while others provide a specification without a clear definition of how the values were derived.
To further cloud the issue (is this an intentional marketing ploy?), one brochure will state a specification as, for example, "Accuracy 0.0012 inches," while another system of the same type will state it as "Accuracy +/- 0.0007 inches." If your job is to evaluate the best system for your particular measurement task, you would quickly ask whether the 0.0012 inch figure is total error, volumetric error, or maximum permissible error, each type of error being unique. The second value provides more information; however, without knowing how the tests were performed, the questions still exist.
To further "muddy" the waters is the question, "What sigma is the specification?" I have seen the same companies offer different sigma specifications for their equipment; again, this seems to be marketing. A one sigma specification is always going to "look" much better to the untrained eye. A two sigma specification may look larger, but as the trained metrologist knows, it is the better specification, as it is closer to "real life." Using a one sigma specification means that only 68% of the time will I achieve the stated accuracy and precision. What about the other 32%? Again, marketing plays a part when the brochure is created: whatever looks better on the spec sheet is what they prefer to use. The technical people at those same companies are never happy with this, as they are left to explain to the customer AFTER the purchase what the specifications truly mean, and that is NEVER a good conversation.
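The 68% figure comes from the normal distribution, and the gap between a one sigma and a two sigma specification can be demonstrated with a quick simulation (the 0.0007 inch sigma below is an arbitrary value chosen for illustration):

```python
import random

random.seed(0)

# Simulate a large number of measurement errors drawn from a
# normal distribution with an illustrative sigma of 0.0007 inch.
sigma = 0.0007
readings = [random.gauss(0.0, sigma) for _ in range(100_000)]

# Fraction of readings that land within the 1-sigma and 2-sigma bands.
within_1s = sum(abs(r) <= 1 * sigma for r in readings) / len(readings)
within_2s = sum(abs(r) <= 2 * sigma for r in readings) / len(readings)

print(f"within 1 sigma: {within_1s:.1%}")  # roughly 68%
print(f"within 2 sigma: {within_2s:.1%}")  # roughly 95%
```

In other words, a "+/- 0.0007 inch (1 sigma)" claim is only met about two times out of three, while the same instrument quoted at two sigma covers roughly 95% of readings with a number twice as large on paper.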
In the United States, the National Institute of Standards and Technology (NIST) assists in providing standards for the testing of our measurement equipment. Perhaps in the future, with enough interest from those in the industry, one method of testing can be developed and become the standard, thus answering the simple questions: How accurate is the device? How precise is the device?
Should it really be that difficult for an experienced Dimensional Metrologist to determine which Device is going to provide the best measurement data for their measurement task?
