NIST developed the alanine dosimetry system in the early 1990s to replace radiochromic dye film dosimeters. Later in the decade the alanine system was firmly established as a transfer service for high-dose radiation dosimetry and an integral part of the internal calibration scheme supporting these services.
Detailed descriptions of the NIST high-dose irradiation facilities and the alanine dosimetry system can be found in the Accomplishments section of this web page below. Service descriptions and the price schedule for NIST high-dose services are available through the accompanying link.
The following topics and supporting information are intended as a resource for users of these high-dose services. This document provides additional information that is either newly developed or unpublished but pertinent to the application or interpretation of service information (e.g., NIST certificates).
The influence of absorbed dose rate:
Check standards are used by the NIST Ionizing Radiation Division to monitor the performance of the alanine dosimetry system that is central to its high-dose transfer dosimetry service. These measurements are performed to confirm the operational readiness of the calibration curve. Deviations from the expected check-standard values can arise from a wide range of sources, including manufacturing abnormalities in a dosimeter and spectrometer-related changes. A few years ago, check-standard measurement deviations revealed a previously unknown rate effect for the alanine dosimetry system (Desrosiers et al., 2008). This rate-effect study characterized a complex relationship between the radiation chemistry of crystalline alanine and the applied dose rate that was also dependent on the absorbed dose. The fact that the rate effect becomes significant only above 5 kGy likely explains why it went undiscovered despite decades of research in alanine dosimetry. It was learned that the effect is intrinsic to alanine and does not depend on the chemical form or manufacturing formulation of the alanine dosimeter. The study postulated that the production of one (or more) of the radiation-induced alanine radicals is dependent on the dose rate.
A follow-up study (Desrosiers and Puhl, 2009) aimed to investigate the influence of irradiation temperature on the dose rate effect. No increase in the effect was found with increasing temperature, but the dose rate effect appeared to be nonexistent at irradiation temperatures of -10 °C and -40 °C.
In summary, it is known that:
- the rate effect is estimated to be
  - zero at dose rates above 2 Gy/s,
  - significant at some value below 2 Gy/s, and
  - clearly measurable at 1 Gy/s;
- the rate effect depends on absorbed dose; it
  - is not measurable at 1 kGy or less,
  - becomes significant above 5 kGy, and
  - reaches a maximum effect at about 30 kGy;
- for doses above 5 kGy, the magnitude of the effect does not depend on the relative values of the two dose rates compared, but only on whether one rate falls above 2 Gy/s and the other below 1 Gy/s. When dosimeters irradiated at rates from these two categories are compared, the effect is measured; if the dose is above 30 kGy, the effect is maximal.
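The conditions summarized above can be collected into a simple decision rule. The sketch below is purely illustrative: the function and its name are hypothetical (not part of any NIST software), and only the thresholds (1 Gy/s, 2 Gy/s, 1 kGy, 5 kGy, 30 kGy) come from the text.

```python
# Hypothetical helper encoding the rate-effect conditions summarized above.
# Thresholds are taken from the text; everything else is illustrative.

def rate_effect_expected(rate_a_gy_s: float, rate_b_gy_s: float,
                         dose_kgy: float) -> str:
    """Classify whether a dose-rate effect is expected when two alanine
    irradiations at the given rates are compared at the given dose."""
    if dose_kgy <= 1.0:
        return "not measurable"      # no effect at 1 kGy or less
    lo, hi = sorted((rate_a_gy_s, rate_b_gy_s))
    # The effect appears only when one rate is below 1 Gy/s and the
    # other is above 2 Gy/s; otherwise no effect is expected.
    if lo < 1.0 and hi > 2.0:
        if dose_kgy >= 30.0:
            return "maximal"         # effect plateaus at about 30 kGy
        if dose_kgy > 5.0:
            return "significant"     # effect is significant above 5 kGy
        return "small"
    return "none expected"
```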
Temperature coefficient studies:
The response of high-dose-range chemical dosimeters is dependent on the dosimeter temperature during irradiation. Typically, irradiation temperatures are estimated by measurements, calculations, or some combination of the two. Then using the temperature coefficient for the dosimetry system, the dosimeter response is adjusted or corrected to be consistent with the irradiation temperature for the calibration curve. Consequently, the estimation of irradiation temperature and the response correction via the temperature coefficient are sources of uncertainty in industrial dosimetry.
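As a minimal sketch of the correction step described above, the snippet below scales a measured response back to the calibration-curve temperature assuming a linear temperature dependence. The coefficient value and reference temperature are placeholders for illustration only, not NIST values.

```python
# Illustrative linear temperature correction of a dosimeter response.
# Both constants below are assumed placeholder values, not NIST data.

TEMP_COEFF_PER_C = 0.0015  # assumed fractional response change per degC
T_REF_C = 25.0             # assumed calibration-curve irradiation temperature

def correct_response(measured_response: float, t_irr_c: float) -> float:
    """Scale the measured response to the reference temperature of the
    calibration curve, assuming a linear temperature dependence."""
    return measured_response / (1.0 + TEMP_COEFF_PER_C * (t_irr_c - T_REF_C))
```

Uncertainty in the estimated irradiation temperature propagates directly through this correction, which is why both steps contribute to the overall dosimetry uncertainty.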
Studies to characterize the temperature coefficient for irradiation of dosimeters below ambient temperatures were completed several years ago (Desrosiers et al., 2004) on the only commercial dosimeter available at that time. A study of the influence of irradiation temperature on modern commercial alanine dosimeters is now complete (Desrosiers et al., 2012).
Conversion of absorbed-dose(water) to absorbed-dose(silicon):
Upon request from customers ordering NIST calibration of dosimeters for radiation hardness studies, NIST will convert its default absorbed-dose(water) to absorbed-dose(silicon). This is accomplished by applying a conversion factor of 0.916 to the dosimeter response prior to calculating the absorbed dose. The conversion factor is calculated as the ratio of two integrals over the weighted photon fluence spectrum in the NIST 60Co Gammacell irradiator; the numerator is weighted with the photon mass energy-absorption coefficient for silicon, the denominator with that for water. The fluence spectrum is derived by Monte Carlo simulation of the NIST Gammacell geometry; as such the 0.916 factor is specific to the NIST irradiator.
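The conversion factor described above is a ratio of two spectrum-weighted integrals, which on a tabulated energy grid reduces to a ratio of weighted sums. The sketch below shows that structure; the function is illustrative, and any numbers fed to it would be placeholders, not the NIST fluence spectrum or coefficient tables.

```python
# Illustrative form of the conversion-factor calculation: the ratio of
# fluence-weighted mass energy-absorption coefficient sums, silicon over
# water, tabulated on a common energy grid. Placeholder data only.

def conversion_factor(fluence, mu_en_si, mu_en_water):
    """Ratio of spectrum-weighted (mu_en/rho) sums: silicon over water.
    All three sequences are tabulated on the same energy grid."""
    numerator = sum(f * m for f, m in zip(fluence, mu_en_si))
    denominator = sum(f * m for f, m in zip(fluence, mu_en_water))
    return numerator / denominator
```

With the NIST Gammacell fluence spectrum and the tabulated coefficients for silicon and water, this ratio evaluates to the 0.916 factor quoted above; because the spectrum comes from a Monte Carlo model of that specific geometry, the factor applies only to the NIST irradiator.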