Force Calibrations using Errors-in-Variables Regression and Monte Carlo Uncertainty Evaluations
Thomas W. Bartel, Sara Stoudt, Antonio M. Possolo
An errors-in-variables regression method is presented as an alternative to the ordinary least-squares regression currently employed to determine the calibration function for force-measuring instruments from data acquired during calibration. A corresponding uncertainty evaluation for the errors-in-variables regression, based on a Monte Carlo method, is also presented. The function (which we call the measurement function) needed for the subsequent use of the calibrated device to measure force, together with its associated uncertainty evaluation, is likewise derived from the calibration results. Comparisons are made, using real force calibration data, between the errors-in-variables and ordinary least-squares analyses, and between the Monte Carlo uncertainty assessment and the conventional uncertainty propagation employed at NIST. The results show that the errors-in-variables analysis properly accounts for the uncertainty in the applied calibrated forces, and that the Monte Carlo method represents the calibration uncertainty throughout the transducer's force range better than the methods currently in use. These improvements notwithstanding, the differences between the results produced by the current and by the proposed methods are generally small, and there is no compelling need to revise any of the force calibration certificates previously issued by NIST.
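The abstract's central idea, propagating the uncertainty of both the applied forces and the transducer readings through the calibration fit by Monte Carlo, can be illustrated with a minimal sketch. This is not the authors' algorithm: the data, uncertainties, and quadratic calibration model below are hypothetical, and the fit in each draw is an ordinary polynomial fit applied to jointly perturbed inputs, a simplified stand-in for a full errors-in-variables estimator.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical calibration data: applied forces (kN) and transducer readings (mV/V).
forces = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
readings = np.array([0.4003, 0.8011, 1.2024, 1.6041, 2.0062])

# Assumed standard uncertainties (illustrative values, not NIST's).
u_force = 0.005 * np.ones_like(forces)      # uncertainty of applied forces (kN)
u_reading = 2e-4 * np.ones_like(readings)   # uncertainty of readings (mV/V)

def mc_calibration(n_draws=5000, degree=2):
    """Monte Carlo propagation: perturb forces AND readings within their
    standard uncertainties, refit the calibration polynomial each draw."""
    coefs = np.empty((n_draws, degree + 1))
    for i in range(n_draws):
        f = forces + rng.normal(0.0, u_force)      # force uncertainty enters here
        r = readings + rng.normal(0.0, u_reading)
        coefs[i] = np.polyfit(f, r, degree)
    return coefs

coefs = mc_calibration()

# Predicted reading and its Monte Carlo standard uncertainty at 25 kN:
preds = np.array([np.polyval(c, 25.0) for c in coefs])
print(preds.mean(), preds.std(ddof=1))
```

Because each draw perturbs the forces as well as the readings, the spread of the fitted curves reflects the uncertainty in the applied calibrated forces, which a fixed-regressor ordinary least-squares analysis ignores.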
Bartel, T. W., Stoudt, S., and Possolo, A., Force Calibrations using Errors-in-Variables Regression and Monte Carlo Uncertainty Evaluations, Metrologia, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=919758 (Accessed December 9, 2023)