
Interlaboratory Comparisons

Published

Author(s)

William F. Guthrie

Abstract

An interlaboratory comparison for a measurement procedure is an exercise carried out by a group of laboratories to compare their performance or assess a measurement standard. Interlaboratory comparisons are typically used for one of three main purposes: to assess the random variation in measurement results across a population of laboratories, to determine the systematic differences in results among a fixed set of laboratories, or to determine the value of a physical property of an artifact or a population of artifacts. There are several design issues to consider when planning an interlaboratory study, including the selection of participating laboratories, the number and stability of the artifacts, the time frame for the comparison, and the number of replicate measurements made in each laboratory. The analysis of interlaboratory data ranges from simple but informative graphical analyses of the differences between laboratories to methods for constructing approximate confidence intervals for linear combinations of variance components. Probabilistic intervals for the physical property of an artifact or population of artifacts can also be obtained using maximum likelihood, generalized pivot quantities, or regression methods that handle random effects between labs and non-constant measurement variation within each lab.
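As a concrete illustration of the simplest analysis described above, the sketch below fits a one-way random-effects model by the method of moments: it pools replicate measurements to estimate the within-lab (repeatability) variance, estimates the between-lab variance component from the spread of the lab means, and combines the lab means into a weighted consensus mean. The data, lab names, and the rough t-based interval with k - 1 degrees of freedom are assumptions made for illustration only; they are not taken from the cited article, which also covers more refined methods such as maximum likelihood and generalized pivot quantities.

```python
# Minimal one-way random-effects (method-of-moments) analysis of
# hypothetical interlaboratory data. Illustrative only.
import numpy as np
from scipy import stats

labs = {
    "Lab A": [10.02, 10.05, 10.01],
    "Lab B": [10.11, 10.08, 10.10],
    "Lab C": [9.98, 10.00, 9.97],
    "Lab D": [10.04, 10.06, 10.03],
}

data = [np.asarray(v, dtype=float) for v in labs.values()]
k = len(data)                                  # number of laboratories
n = np.array([len(d) for d in data])           # replicates per lab
means = np.array([d.mean() for d in data])

# Within-lab (repeatability) variance: pooled mean square within labs.
ss_within = sum(((d - d.mean()) ** 2).sum() for d in data)
ms_within = ss_within / (n.sum() - k)

# Between-lab mean square and ANOVA estimate of the between-lab
# variance component, truncated at zero if negative.
grand_mean = np.concatenate(data).mean()
ms_between = (n * (means - grand_mean) ** 2).sum() / (k - 1)
n0 = (n.sum() - (n ** 2).sum() / n.sum()) / (k - 1)   # effective group size
var_between = max(0.0, (ms_between - ms_within) / n0)

# Consensus mean: each lab mean weighted by the inverse of its
# estimated total variance (between-lab + within-lab / n_i).
w = 1.0 / (var_between + ms_within / n)
consensus = (w * means).sum() / w.sum()
se = np.sqrt(1.0 / w.sum())

# Approximate 95 % interval using a t quantile with k - 1 degrees of
# freedom -- a common, if rough, choice when only a few labs take part.
t = stats.t.ppf(0.975, k - 1)
print(f"within-lab variance  : {ms_within:.5f}")
print(f"between-lab variance : {var_between:.5f}")
print(f"consensus mean       : {consensus:.4f} +/- {t * se:.4f}")
```

In practice the weights depend on the estimated variance components, so the resulting interval is only approximate; the graphical checks and interval methods mentioned in the abstract address exactly this complication.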

Keywords

consensus mean, experiment design, interlaboratory comparison, measurement standard, round robin, variance components

Citation

Guthrie, W. (2007), Interlaboratory Comparisons, Encyclopedia of Statistics in Quality and Reliability (Accessed December 11, 2024)


Created December 9, 2007, Updated February 19, 2017