Assuring Measurement Accuracy with Gage R&R

March 11, 2019
2 min read
General

Variation is inherent in any system; however, excessive variation in the measurement process can lead to misleading analysis. That excess variation shows up in your data and can appear on control charts as process variation, skewing the results and preventing accurate analysis.

Most organizations routinely calibrate their measurement equipment, but a calibrated measurement system does not guarantee that the data collected are accurate. Collecting data with measurement instruments such as calipers also calls for repeatability and reproducibility (Gage R&R) studies.

What is Gage R&R?

Gage R&R is a statistical tool that measures the amount of variation in the measurement system arising from the measurement device and the people taking the measurement. Repeatability is the variation in a series of measurements taken by one person using one gage to measure one characteristic of an item. Reproducibility is the variation in a series of measurements of that same characteristic taken by different people using the same gage.
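To make these two components concrete, here is a minimal sketch of how they might be estimated from a crossed Gage R&R study, using Python with NumPy, simulated caliper-style data, and the standard ANOVA variance-component formulas. The operator, part, and trial counts and the measurement values are hypothetical and only illustrate the calculation.

```python
import numpy as np

# Hypothetical crossed study: 3 operators measure 5 parts, 2 trials each.
# data[operator, part, trial] holds simulated measurements (e.g., mm).
rng = np.random.default_rng(0)
true_part = np.array([10.0, 10.2, 9.8, 10.1, 9.9])      # part-to-part variation
operator_bias = np.array([0.00, 0.05, -0.04])            # reproducibility source
data = (true_part[None, :, None]
        + operator_bias[:, None, None]
        + rng.normal(0, 0.03, size=(3, 5, 2)))           # repeatability noise

o, p, r = data.shape                                      # operators, parts, trials
grand = data.mean()

# Two-way ANOVA sums of squares for a crossed (operator x part) study
ss_oper = p * r * ((data.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_part = o * r * ((data.mean(axis=(0, 2)) - grand) ** 2).sum()
cell_means = data.mean(axis=2)
ss_inter = r * ((cell_means
                 - data.mean(axis=(1, 2))[:, None]
                 - data.mean(axis=(0, 2))[None, :] + grand) ** 2).sum()
ss_error = ((data - cell_means[:, :, None]) ** 2).sum()

ms_oper = ss_oper / (o - 1)
ms_inter = ss_inter / ((o - 1) * (p - 1))
ms_part = ss_part / (p - 1)
ms_error = ss_error / (o * p * (r - 1))

# Variance components (negative estimates are clipped to zero)
var_repeat = ms_error                                     # repeatability
var_inter = max((ms_inter - ms_error) / r, 0)
var_oper = max((ms_oper - ms_inter) / (p * r), 0)
var_reprod = var_oper + var_inter                         # reproducibility
var_part = max((ms_part - ms_inter) / (o * r), 0)

var_grr = var_repeat + var_reprod
var_total = var_grr + var_part
print(f"%GRR (share of total variation): {100 * np.sqrt(var_grr / var_total):.1f}%")
```

A common rule of thumb is that a %GRR below roughly 10% indicates an acceptable measurement system, while values above about 30% suggest the measurement process itself needs improvement before the data can be trusted.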

Why Complete Gage R&R Studies?

Gage R&R studies address two significant causes of variation in measurement systems: gage variability and operator variability. Gages may be subject to factors such as temperature or magnetic and electrical fields that affect their accuracy. Operator variability may stem from different interpretations of a vague operational definition, or from differences in operators' background, fatigue, or even attitude. Gage R&R studies are therefore critical to assuring consistency and accuracy when collecting data with measurement equipment.