The activities described here support the application of high-dose, high-energy ionizing radiation for a variety of industrial processes. Radiation-induced materials modifications improve the properties of plastic films and packaging, as well as the protective insulation on wire and cables. Inks on commercial packaging can be cured through a pollution-free process that avoids the use of volatile organic compounds (VOCs). Perhaps the largest application is the sterilization of disposable, single-use medical products (e.g., syringes, bandages, sutures); more than 50 % of such products are sterilized with radiation. A related application is the irradiation of blood to prevent Graft-Versus-Host-Disease in transfusions. Food irradiation to reduce pathogens (e-coli, salmonella, listeria, etc.) is a pasteurization process that is growing in public acceptance to deal with the risks of food-borne illnesses. Irradiation is used also in agricultural pest control.
The high-dose dosimetry program supports radiation-processing applications by assuring that the absorbed dose to the product, often prescribed or limited by regulatory agencies, is traceable to NIST standards. In addition, our most accurate measurements using small alanine-pellet dosimeters for these high-dose processes show promise for providing traceability to national measurement standards in clinical applications, particularly for the small-field radiation beams increasingly used in radiation therapy. High-dose dosimetry service descriptions can be found in Procedures 11 and 12 of the RBPD Quality System.
NIST developed the alanine dosimetry system in the early 1990s to replace radiochromic dye film dosimeters. Later in the decade the alanine system was firmly established as a transfer service for high-dose radiation dosimetry and as an integral part of the internal calibration scheme supporting these services.
Detailed descriptions of the NIST high-dose irradiation facilities and the alanine dosimetry system can be found in the Accomplishments section of this web page below. Service descriptions and the price schedule for NIST high-dose services are found at this link.
The following topics and supporting information are intended as a resource for users of these high-dose services. This document provides additional information, either newly developed or unpublished, that is pertinent to the application or interpretation of service information (e.g., NIST certificates).
The influence of absorbed dose rate:
Check standards are used by the NIST Ionizing Radiation Division to monitor the performance of the alanine dosimetry system that is central to its high-dose transfer dosimetry service. These measurements are performed to confirm the operational readiness of the calibration curve. Deviations from the expected check-standard values can result from a wide range of sources, including manufacturing abnormalities in a dosimeter and spectrometer-related changes. A few years ago, check-standard measurement deviations revealed a previously unknown rate effect for the alanine dosimetry system (Desrosiers et al., 2008). This rate-effect study characterized a complex relation between the radiation chemistry of crystalline alanine and the applied dose rate that was also dependent on the absorbed dose. The fact that the rate effect becomes significant only above 5 kGy likely explains why it was discovered only recently, despite decades of research in alanine dosimetry. It was learned that the effect is intrinsic to alanine and is not dependent on the chemical form or manufacturing formulation of the alanine dosimeter. The study postulated that the production of one (or more) of the radiation-induced alanine radicals is dependent on the dose rate.
A follow-up study (Desrosiers and Puhl, 2009) aimed to investigate the influence of irradiation temperature on the dose rate effect. No increase in the effect was found with increasing temperature, but the dose rate effect appeared to be nonexistent at irradiation temperatures of -10 °C and -40 °C.
In summary, it is known that:
- the dose rate effect is intrinsic to alanine and does not depend on the chemical form or manufacturing formulation of the dosimeter;
- the effect becomes significant only above 5 kGy;
- the effect does not increase with irradiation temperature, and it appears to be absent at irradiation temperatures of -10 °C and -40 °C.
Temperature coefficient studies:
The response of high-dose-range chemical dosimeters is dependent on the dosimeter temperature during irradiation. Typically, irradiation temperatures are estimated by measurements, calculations, or some combination of the two. Then, using the temperature coefficient for the dosimetry system, the dosimeter response is adjusted or corrected to be consistent with the irradiation temperature for the calibration curve. Consequently, the estimation of irradiation temperature and the response correction via the temperature coefficient are sources of uncertainty in industrial dosimetry. Studies to characterize the temperature coefficient for irradiation of dosimeters below ambient temperatures were completed several years ago (Desrosiers et al., 2004) on the only commercial dosimeter available at that time. A study of the influence of irradiation temperature on modern commercial alanine dosimeters is now complete (Desrosiers et al., 2012).
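The correction described above can be sketched as a one-line adjustment with a linear temperature coefficient. The reference temperature and coefficient value below are hypothetical placeholders for illustration, not NIST's published values:

```python
def correct_for_temperature(response, t_irr_c, t_ref_c=25.0, coeff_per_c=0.0015):
    """Adjust a dosimeter response measured at irradiation temperature
    t_irr_c to the reference temperature t_ref_c of the calibration curve,
    assuming a linear temperature coefficient (fractional change per degree C).
    The default coefficient of 0.15 %/degree C is a placeholder value."""
    return response / (1.0 + coeff_per_c * (t_irr_c - t_ref_c))

# A dosimeter irradiated 10 degrees C above the calibration temperature
# reads high; the correction removes the temperature-induced excess.
corrected = correct_for_temperature(1.015, 35.0)
```

In practice the coefficient itself may depend on dose and dosimeter formulation, which is precisely what the studies cited above characterize.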
Conversion of absorbed-dose(water) to absorbed-dose(silicon):
Upon request from customers ordering NIST calibration of dosimeters for radiation hardness studies, NIST will convert its default absorbed-dose(water) to absorbed-dose(silicon). This is accomplished by applying a conversion factor of 0.916 to the dosimeter response prior to calculating the absorbed dose. The conversion factor is calculated as the ratio of two integrals over the weighted photon fluence spectrum in the NIST 60Co Gammacell irradiator; the numerator is weighted with the photon mass energy-absorption coefficient for silicon, the denominator with that for water. The fluence spectrum is derived by Monte Carlo simulation of the NIST Gammacell geometry; as such the 0.916 factor is specific to the NIST irradiator.
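The structure of the calculation can be sketched as follows. Only the 0.916 factor itself is taken from the text; the coefficient and fluence arrays in the usage example are made-up placeholders, and a linear dose response is assumed for illustration:

```python
WATER_TO_SILICON = 0.916  # NIST Gammacell-specific value quoted above

def conversion_factor(fluence, muen_rho_si, muen_rho_water):
    """Ratio of fluence-weighted mass energy-absorption coefficients,
    silicon over water, evaluated on a common energy grid (a discrete
    approximation of the two integrals described in the text)."""
    num = sum(f * m for f, m in zip(fluence, muen_rho_si))
    den = sum(f * m for f, m in zip(fluence, muen_rho_water))
    return num / den

def dose_silicon(response_water_gy, factor=WATER_TO_SILICON):
    """Apply the conversion factor to the dosimeter response before the
    absorbed dose is calculated, per the procedure described above."""
    return factor * response_water_gy
```

Because the fluence spectrum comes from a Monte Carlo model of the NIST Gammacell geometry, the resulting factor is valid only for that irradiator; a different source spectrum would require recomputing the ratio.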
A list of major program/service facilities and activities:
Irradiation Facilities:
For industrial high-dose dosimetry calibration services, the measurement quantity of interest is absorbed dose (to water, primarily), reported in units of gray. At NIST, absorbed dose is realized by a water calorimeter in a gamma-ray field produced by a Vertical Beam 60Co source with an activity (as of April, 2007) of 48 TBq (1.3 kCi). The technical specifications for this source were described previously. Because the dose rate for this source is relatively low, a high-dose-rate source is needed to perform customer calibrations. The bulk of the services are provided through three Gammacell 220 60Co irradiators (Nordion, Canada). Their activities are: 37 TBq (1.0 kCi, serial number GC45); 137 TBq (3.7 kCi, serial number GC232); and 666 TBq (18 kCi, serial number GC207), all as of April, 2007. Though also available for service operation, the Pool 60Co source (0.15 kCi as of April, 2007) is rarely requested for calibration work by customers because its low absorbed-dose rate is no longer relevant to industrial needs; however, the Pool source continues to play an important role in the international comparisons described here.
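Because all of the activities above are quoted as of April, 2007, present-day values must be decay-corrected. A minimal sketch, using the standard radioactive-decay law and the well-known 60Co half-life of approximately 5.27 years:

```python
import math

CO60_HALF_LIFE_Y = 5.2711  # 60Co half-life in years

def activity_now(a0_tbq, years_elapsed, half_life_y=CO60_HALF_LIFE_Y):
    """Decay-correct a source activity (TBq) from its reference date
    by the elapsed time in years."""
    return a0_tbq * math.exp(-math.log(2.0) * years_elapsed / half_life_y)

# Example: the 666 TBq Gammacell (GC207, April 2007) after exactly
# one half-life has half its reference activity.
gc207_after_one_half_life = activity_now(666.0, CO60_HALF_LIFE_Y)
```

This decay correction is also why the relative standing of the sources shifts over time: the Vertical Beam source's rate has decayed to the point that, as described below, direct comparisons against it became impractically long.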
Calibration of the gamma sources within the NIST high-dose calibration facility is performed by measuring the ratio of the alanine dosimeter response for the source being calibrated to that of a reference source. The absorbed dose for these internal calibrations is 1 kGy or less. This approach simplifies the source comparison to a measurement of two quantities: dosimeter response and time. Absorbed dose is not computed for this calibration exercise because the added steps would introduce the uncertainties inherent in the calibration curve; the approach also avoids any issues that might arise from non-linearity in the dosimetry system's dose response. Moreover, the fact that this response-per-time calibration scheme was able to reveal the subtle rate dependence described here is strong support for the validity of the method.
The four 60Co sources described here are used for NIST high-dose calibration services. Prior to 2004, the Pool source and the Gammacells were each calibrated by a direct dosimeter-response ratio to the Vertical Beam 60Co source (Humphreys et al., 1998a). However, the Vertical Beam source's dose rate has decayed to a level that requires excessive periods of time (>24 h) to perform comparisons at the absorbed doses (≥1 kGy) routinely used. Because Vertical Beam irradiations are performed under water, with the water surface in the vessel exposed to the room environment, there were concerns that variation in the water level would contribute significantly to the measurement uncertainty, as it would be difficult to keep the level constant over a prolonged irradiation. To address the increasingly long Vertical Beam irradiation times, the calibration scheme was modified in 2004 so that the Vertical Beam source is compared only to the Pool source, and the absorbed dose for the Pool/Vertical-Beam comparison was lowered to 140 Gy to improve several aspects of the measurement. The three Gammacells are calibrated by comparison to the Pool source. The Pool source serves well as an intermediary in the calibration scheme because its dose rate is closer to that of the Gammacells; this permits longer exposure times, resulting in reduced timer uncertainties.
The calibration scheme begins with the known dose rate of the Vertical Beam 60Co source. To transfer that dose rate, eight alanine pellets are irradiated in the calorimeter water tank. The pellets are stacked in a watertight polystyrene cylinder whose axis is fixed perpendicular to the Vertical Beam 60Co source at a scale distance of 58.8 cm. The water surface is set at a scale distance of 53.8 cm. This design differs from the published scheme (Humphreys et al., 1998a). In the published scheme, this irradiation was done in a polystyrene phantom and a scaling theorem was used to correct for differences in photon interaction cross sections between polystyrene and water. This direct underwater measurement was an improvement as it eliminated scaling theorem uncertainties. In the current calibration scheme the dosimeters are irradiated at the appropriate distance underwater, and no additional corrections are applied to the measured data.
Concurrently with the Vertical Beam irradiation, alanine dosimeters are irradiated to the same absorbed dose (as described in Humphreys et al., 1998b) at the absolute center of the isodose region of the Pool-source gamma field. This comparison may be repeated as necessary to achieve an acceptable precision of 0.5 %. The dosimeters are measured using EPR, and each dosimeter response is divided by the irradiation time to convert it to units of response/s. Once the measurements are converted to these common units, the established dose rate of the Vertical Beam source can be used to determine the dose rate of the Pool source.
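Once both sets of measurements are expressed in response per second, the transfer reduces to a simple ratio. A minimal sketch, with entirely hypothetical numbers:

```python
def transferred_dose_rate(known_rate_gy_per_s, resp_per_s_unknown, resp_per_s_known):
    """Dose rate of the source under comparison, from the known source's
    dose rate and the ratio of alanine responses per second measured in
    each field (the response-per-time scheme described above)."""
    return known_rate_gy_per_s * (resp_per_s_unknown / resp_per_s_known)

# Hypothetical example: if the Vertical Beam rate were 0.01 Gy/s and the
# Pool-source dosimeters showed 2.5x the response per second, the Pool
# rate would be 0.025 Gy/s.
pool_rate = transferred_dose_rate(0.01, 2.5, 1.0)
```

The same ratio operation is applied again, Pool source to Gammacells and Gammacell center to each customer irradiation geometry, so the entire internal calibration chain is built from these pairwise comparisons.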
Similarly, a series of comparisons are made between the dose rates at the center positions of the Pool source and the three Gammacells (GC45, GC232, and GC207) with the alanine transfer vial placed on a polystyrene pedestal set to position the dosimeters in the absolute center of the isodose region of the gamma field. For these comparisons a higher dose is used (e.g., 1 kGy) to reduce the contribution of uncertainty in the timer settings for the highest dose-rate Gammacell. In the GC232 and GC207, irradiations are performed on a pedestal either in a stainless-steel dewar or without a dewar; the dewar is used to improve temperature control at the extremes of the irradiation temperature ranges. The dosimeters are measured and the response/s is determined. The center-position dose rate for each Gammacell is determined by comparison to the Pool source center-position dose rate.
It should be noted that the equivalent transit time, the time subtracted from the timer setting that accounts for the absorbed dose received by the dosimeters during the delivery of the dosimeters to and from the irradiation position, is determined for each source. To measure the equivalent transit time, alanine dosimeters are irradiated for a series of very short times. Typically, these times are 5 s, 10 s, 20 s, 30 s, 40 s, and 50 s. The dosimeter response is measured and plotted versus irradiation time. A linear regression of these data is extrapolated to the x axis. The absolute value of the x intercept is the equivalent transit time.
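The transit-time determination above amounts to a linear fit followed by an x-axis extrapolation. A minimal sketch with synthetic data (a fabricated response constant and a true transit time of 2 s, chosen only for illustration):

```python
def equivalent_transit_time(timer_settings_s, responses):
    """Fit response vs. timer setting with a least-squares line and
    return the absolute value of the x intercept, i.e. the equivalent
    transit time described above."""
    n = len(timer_settings_s)
    mt = sum(timer_settings_s) / n
    mr = sum(responses) / n
    slope = (sum((t - mt) * (r - mr) for t, r in zip(timer_settings_s, responses))
             / sum((t - mt) ** 2 for t in timer_settings_s))
    intercept = mr - slope * mt
    return abs(-intercept / slope)

# Synthetic data: response proportional to (timer setting + 2 s), i.e.
# the dosimeters pick up 2 s worth of dose in transit.
times = [5.0, 10.0, 20.0, 30.0, 40.0, 50.0]
resp = [0.04 * (t + 2.0) for t in times]
tt = equivalent_transit_time(times, resp)
```

Subtracting this value from every timer setting then makes the delivered dose proportional to the corrected time for all subsequent irradiations on that source.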
Customer-supplied dosimeters for calibration are routinely irradiated in one of three calibration geometries: ampoule, Perspex, and film block. The rates for each of these positions in the Gammacells (though not all positions are used in each Gammacell), with and without a dewar present, are determined by comparison of the response/s for dosimeters irradiated in these positions to the response/s for dosimeters irradiated in the center position of the respective Gammacell. This final portion of the calibration scheme remains unchanged from that previously published (Humphreys et al., 1998a). As a final check of the dose rates, all irradiation geometries are compared to confirm an equivalent measurement response for dosimeters irradiated to 1 kGy.
The Electron Paramagnetic Resonance (EPR) dosimetry facility and the gamma-ray irradiators that support the measurement system are unique in design and capabilities. The research efforts in radiation metrology provided by this facility have enabled U.S. industries to advance several new technologies, especially in health-related areas, and to improve U.S. manufacturing efficiency. In recent years a new measurement-service-class EPR spectrometer was procured (for $150k) and is being integrated into the system. More recently, an MOU was signed between NIST and the Uniformed Services University of the Health Sciences to transfer a state-of-the-art EPR spectrometer to NIST for collaborative research.
The international comparison described below is the second one conducted. The first comparison, though conducted in the 1990s, was not published until 2006.
NIST participated in a recent international comparison organized by the BIPM (details below). At the request of NIST, the comparison was preceded by a NIST-led comparison to demonstrate NIST-recommended modifications to the BIPM protocol. The NIST comparison demonstrated the need to know each participant's calibration scheme and dose rates, as well as the value of an additional dose level (1 kGy, in addition to 5 kGy, 15 kGy, and 30 kGy). The rationale for modifying the protocol stemmed from the recent NIST characterization of a subtle and previously unknown dose rate effect in alanine, whose greatest impact was on NMI metrology. The discovery of this effect and its contribution to international comparison data demonstrate the quality of NIST measurement capabilities and facilities, and are a direct result of the quality system implemented at NIST over the last decade. The effect contradicted commonly accepted characteristics of the alanine system, and initial reports of it were met with criticism from other NMIs; however, the effect has since been reproduced and is now accepted as fact.
Eight national standards for absorbed dose to water in 60Co gamma radiation at the dose levels used in radiation processing have been compared over the range from 1 kGy to 30 kGy using the alanine dosimeters of the NIST and the NPL as the transfer dosimeters. The comparison was organized by the Bureau International des Poids et Mesures, which also participated at the lowest dose level using its radiotherapy-level standard for the same quantity. The national standards are in general agreement within the standard uncertainties, which are in the range from 1 to 2 parts in 10².
Eight laboratories offering high-dose irradiation services took part in the comparison: the Czech Metrology Institute Inspectorate for Ionizing Radiation (CMI-IIR, Czech Republic), the Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA-INMRI, Italy), the Laboratoire National Henri Becquerel (LNE-LNHB, France), the National Institute of Metrology (NIM, China), the National Institute of Standards and Technology (NIST, USA), the National Physical Laboratory (NPL, UK), the High Dose Reference Laboratory of the Danish Technical University (Risø-HDRL, Denmark), and the Institute for Physical-Technical and Radiotechnical Measurements, Rostekhregulirovaniye of Russia (VNIIFTRI, Russian Federation). All laboratories hold primary standards with the exception of the CMI-IIR and the Risø-HDRL, who hold secondary standards traceable to the IAEA and the NPL, respectively. In addition, the BIPM, although it does not offer a high-dose service, took part at the lowest dose level (1 kGy) to provide a direct link to the international reference for absorbed dose to water in 60Co. Two transfer dosimeters were used for the comparison: the alanine/ESR dosimetry system of the NIST and that of the NPL.
Therapy Level Alanine Dosimetry:
Recent developments in radiotherapy have significantly increased the use of small fields in stereotactic procedures and of IMRT fields composed of small fields. Ionization chambers are not always suitable in situations involving high dose gradients, time-dose variance, and non-uniform beam distributions. Volume averaging and lack of electronic equilibrium complicate the use of ionization chambers for the dosimetry of small photon beams. An international protocol is under development by the IAEA and the AAPM to address the difficult problems of small-field dosimetry in radiation therapy (e.g., Gamma Knife, IMRT, CyberKnife, TomoTherapy). One promising path from national standards in large reference fields is the use of alanine/EPR dosimetry. The project is being conducted on multiple levels in parallel: