Determination of the colloid concentration of a given single-wall carbon nanotube (SWCNT) dispersion is a basic requirement for many studies. The commonly used optical-absorption-based concentration measurement is complicated by spectral changes arising from variations in nanotube chirality and length. In particular, the origin of the observed length-dependent spectral change and its effect on concentration determination have been the subject of considerable debate. Here, we use length-fractionated DNA-wrapped SWCNTs to establish the relationship between carbon nanotube concentration and optical absorption spectra by directly quantifying the amount of wrapping DNA and, independently, the DNA/carbon nanotube mass ratio. We find that the customary nanotube concentration measurements, based on the E11 absorption peak or on the spectral baseline, do not correctly represent the concentration of SWCNTs. Instead, a new method, spectral integration of the E11 optical transition region, correlates most closely with the measured nanotube concentration. Finally, we observe that shorter DNA-SWCNT fractions contain more curved carbon nanotubes, and propose that these nanotubes, presumably highly defective, contribute significantly to the baseline increase in the absorption spectra of shorter nanotube fractions.
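As a minimal illustration of the idea behind band-area versus peak-height metrics, the sketch below integrates a baseline-corrected absorbance over a spectral window using the trapezoidal rule. The function name, the wavelength window, the baseline value, and the synthetic spectrum are all hypothetical and are not taken from the paper; a real analysis would use measured spectra and the actual E11 region for the sample.

```python
# Hypothetical sketch: integrate baseline-corrected absorbance over a
# spectral window (trapezoidal rule), rather than reading a single peak
# height. Window, baseline, and spectrum below are illustrative only.

def integrate_band(wavelengths, absorbance, lo, hi, baseline=0.0):
    """Trapezoidal integral of (absorbance - baseline) over [lo, hi] nm."""
    area = 0.0
    for (w1, a1), (w2, a2) in zip(
        zip(wavelengths, absorbance), zip(wavelengths[1:], absorbance[1:])
    ):
        # Include only trapezoids fully inside the integration window.
        if lo <= w1 and w2 <= hi:
            area += 0.5 * ((a1 - baseline) + (a2 - baseline)) * (w2 - w1)
    return area

# Synthetic spectrum: flat baseline of 0.1 plus a triangular "peak"
# centered at 1000 nm with height 0.5 and half-width 100 nm.
wl = list(range(900, 1101, 10))
ab = [0.1 + max(0.0, 0.5 - abs(w - 1000) / 100 * 0.5) for w in wl]
area = integrate_band(wl, ab, 900, 1100, baseline=0.1)
print(round(area, 3))  # triangle area: 0.5 * 200 * 0.5 = 50.0
```

The band area is far less sensitive than a single peak reading to line-shape changes that redistribute intensity within the window, which is the qualitative motivation for the spectral-integration metric described above.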
Citation: Analytical Chemistry
Pub Type: Journals
Keywords: carbon nanotube, extinction coefficient, concentration measurement