A number of parameters are used to characterize a measurement result and to judge its fitness for an intended purpose. Precision (variability) and accuracy (correctness) are two of the most often used and, like many other characterizing parameters, they frequently do not carry the same meaning for the supplier, the buyer, and the ultimate user of dimensional metrology instruments in semiconductor processing. These differences in meaning arise in part because common sense about measurements of ordinary-sized objects can be misleading when applied to submicrometer-sized objects. For example, there is no universal ruler for measuring the size of all submicrometer objects, because dissimilarities between the ruler and the object cause them to be viewed differently by light, by electron beams, and by mechanical probes. Nevertheless, the basic concepts behind the characterization of measurement results can be carried over from ordinary-sized to submicrometer-sized objects, provided the differences in the metrology applicable to these dimensional regimes are taken into account. This paper summarizes the generally accepted generic metrological meanings and significance of the parameters most commonly used to characterize measurement results, in order to clarify misunderstandings that might otherwise arise between the metrologist and the user of metrological data in the regime of submicrometer-sized features.
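The distinction the abstract draws between precision (variability) and accuracy (correctness) can be illustrated with a minimal sketch, not taken from the paper: given hypothetical repeated readings of a reference feature with an assumed certified linewidth, the bias of the mean reflects accuracy, while the standard deviation of the readings reflects precision.

```python
# Illustrative sketch (assumed values, not data from the paper):
# precision vs. accuracy for repeated measurements of a reference
# feature of known (certified) linewidth.
import statistics

reference_nm = 250.0  # assumed certified linewidth of the reference feature
# hypothetical repeated readings from one instrument, in nanometers
readings_nm = [252.1, 251.8, 252.4, 251.9, 252.2]

mean_nm = statistics.mean(readings_nm)
bias_nm = mean_nm - reference_nm              # accuracy: closeness of the mean to the true value
precision_nm = statistics.stdev(readings_nm)  # precision: spread of repeated readings

print(f"mean = {mean_nm:.2f} nm, bias = {bias_nm:+.2f} nm, "
      f"1-sigma precision = {precision_nm:.2f} nm")
```

In this sketch the instrument is quite precise (small scatter) yet noticeably inaccurate (a systematic offset from the certified value), which is exactly the situation in which the two terms must not be conflated.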
Citation: Handbook of Critical Dimension Metrology and Process Control CR52 by Kevin M. Monahan, Editor
Publisher Info: SPIE-International Society for Optical Engineering, Bellingham, WA
Pub Type: Books
Keywords: accuracy, calibration, CD measurements, linearity, linewidth, metrology, modeling, resolution, sensitivity, standards, submicrometer, traceability, uncertainty