February 2026 Volume 8

OPERATIONS & MANAGEMENT

The typical use of calipers involves an operator positioning the unit under test in one hand (or on a flat surface), then operating the calipers with the other hand. The operator uses the thumb wheel to open the jaws, places the open jaws around the feature of interest, then uses the thumb wheel again to close the jaws until they contact the relevant points on the component. When the operator stabilizes this arrangement, they read the gage’s output. Each step in this process (holding the component, positioning the gage, and operating the thumb wheel) introduces another source of variation between operators. Compounded together, these differences form AV.

Why It Matters in Forging

Forging environments introduce unique measurement challenges that make accurate gaging even more critical. Scale, die wear, and dimensional changes through the part’s cooling cycle all create real variation that must be distinguished from measurement error. A reliable measurement system allows quality and engineering teams to separate true process issues from gage-related noise, an essential step in controlling any forging operation.

Two Methods of Gage R&R Analysis

Gage R&R analysis is performed using one of two major methods: the Average and Range (A&R) method or the ANOVA method. Most fill-in-the-blank Excel-based Gage R&R templates rely on the simpler A&R method, an excellent starting point for quality professionals who need a validated statistical method without a steep learning curve. The study produces estimates of EV, AV, and the combined Total R&R using summary statistics and conversion factors called “K factors.” Each of these estimates can be expressed either as a percentage of the total variation found in the study or as a percentage of a feature’s tolerance range, depending on the application.
While the A&R method offers simplicity, the ANOVA method provides more insight, using the analysis-of-variance statistical technique to partition the total observed variation into components attributable to parts, operators, and their interactions. This allows practitioners not only to estimate EV and AV with greater accuracy, but also to identify whether significant part-operator interaction effects exist, an insight the A&R method cannot provide. Regardless of the analysis method, designing a Gage R&R study involves selecting the number of parts, the number of operators, and the number of measurements per part. For instance, a common form of the study is the “10 x 3 x 3,” which requires 10 parts × 3 operators × 3 measurements per part, for a total of 90 measurements.
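For readers who want to see the mechanics, the core A&R calculations can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not a validated implementation: the function name and data layout are invented for this example, and the default K factors are the constants commonly tabulated for a study with 3 trials (K1) and 3 operators (K2). Consult an MSA reference for the factors matching your own study design.

```python
import math

def avg_and_range_grr(data, k1=0.5908, k2=0.5231):
    """Estimate EV and AV with the Average and Range (A&R) method.

    data: data[operator][part] -> list of repeated measurements.
    k1, k2: conversion factors; the defaults are the constants
    commonly tabulated for 3 trials (k1) and 3 operators (k2).
    """
    n_parts = len(data[0])
    n_trials = len(data[0][0])

    # Average range of the repeat measurements in each operator/part cell.
    ranges = [max(trials) - min(trials) for op in data for trials in op]
    r_bar = sum(ranges) / len(ranges)

    # Equipment Variation (repeatability).
    ev = r_bar * k1

    # The spread between operator averages drives Appraiser Variation.
    op_means = [sum(sum(t) for t in op) / (n_parts * n_trials) for op in data]
    x_diff = max(op_means) - min(op_means)

    # AV, corrected for the repeatability contained in the operator averages;
    # a negative result under the square root is reported as zero.
    av_sq = (x_diff * k2) ** 2 - ev ** 2 / (n_parts * n_trials)
    av = math.sqrt(av_sq) if av_sq > 0 else 0.0

    grr = math.sqrt(ev ** 2 + av ** 2)  # Total Gage R&R
    return ev, av, grr
```

With a 10 x 3 x 3 study, `data` would hold 3 operators × 10 parts × 3 trials, and the function returns the EV, AV, and Total R&R estimates the article describes.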

From these data and the chosen analytical method, a study facilitator can calculate estimates of EV and AV, which can then be combined with additional statistics to answer questions like:

• How much of the observed variation is coming from the measurement system itself?
• Can this measurement system reliably distinguish between good parts and bad parts?
• Is the measurement system suitable for use in production?

For example, I recently conducted a 10 x 3 x 3 Gage R&R study using the Average and Range method to analyze the use of digital micrometers on the thickness of sheet metal samples. The data and calculations are too extensive for this article, but some of the key output statistics are reported in Table 1.

Table 1. Key output statistics from the example study.

What can we infer from these statistics?

1. Repeatability is the dominant source of gage error. Nearly all of the gage’s contribution to variation comes from the equipment itself, not from differences between operators.

2. Total Gage R&R as a percentage of Total Variation (1.60%) is excellent. This is a measure of the total error found in the study as a percentage of the total variation (part variation + error) found in the study. A common industry guideline for both this metric and the P/T Ratio (discussed next) is:

• <10% → Acceptable
• 10–30% → May be acceptable depending on application
• >30% → Usually unacceptable

3. Total R&R as a % of Tolerance (13.60%), also known as the Precision-to-Tolerance Ratio or P/T Ratio, is acceptable for many applications. Unlike Total Gage R&R %, the P/T Ratio compares Total R&R to a specific product tolerance. Applied to different product tolerances, the same study data will produce different P/T Ratios. The P/T Ratio helps answer the question: Is the reliability of this gage sufficient for a specific application?
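The two ratios and the acceptance guideline are straightforward to encode. The Python sketch below uses illustrative names and assumes the Total R&R, part variation, and tolerance inputs are all expressed in the same spread convention (for example, all as 6-sigma widths); some references use a 5.15-sigma spread instead, so treat the scaling as an assumption.

```python
import math

def grr_metrics(grr, part_variation, tolerance):
    """Express Total Gage R&R against total variation and tolerance.

    All three inputs must share one spread convention (e.g. 6-sigma
    widths) for the percentages to be meaningful.
    """
    # Total Variation combines true part variation and measurement error.
    total_variation = math.sqrt(part_variation ** 2 + grr ** 2)
    pct_tv = 100 * grr / total_variation   # Total Gage R&R % of Total Variation
    pt_ratio = 100 * grr / tolerance       # P/T Ratio (% of tolerance)
    return pct_tv, pt_ratio

def acceptability(pct):
    """Common industry guideline, applied to %TV and P/T Ratio alike."""
    if pct < 10:
        return "acceptable"
    if pct <= 30:
        return "may be acceptable depending on application"
    return "usually unacceptable"
```

The same `acceptability` thresholds apply to both outputs of `grr_metrics`, mirroring the guideline above: a study can score well against total variation yet still be marginal against a tight tolerance.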

Quick Reference to Key Terms

MSA – Measurement Systems Analysis; a collection of methods used to identify and quantify sources of measurement error.
Repeatability – Also known as Equipment Variation (EV); variation when the same operator measures the same part with the same gage.
Reproducibility – Also known as Appraiser Variation (AV); variation between different operators using the same gage.
Total Gage R&R – The combined effect of EV and AV, used to assess measurement capability.
P/T Ratio – Precision-to-Tolerance Ratio; compares measurement system error to product tolerance.
Total Variation – The combination of true part variation and measurement variation.

FIA MAGAZINE | FEBRUARY 2026
