Definitions of the Limit of Detection (LoD) based on the probabilities of false positive and/or false negative errors have been proposed over the years. Although such definitions are straightforward and valid for any kind of analytical system, the proposed methodologies for estimating the LoD are usually restricted to signals with Gaussian noise. Additionally, there is a general misconception that two systems with the same LoD provide the same amount of information about the source regardless of the prior probability of presenting a blank/analyte sample. Based on an analogy between an analytical system and a binary communication channel, in this paper we show that the amount of information that can be extracted from the analytical system depends on the prior probabilities of the two possible states. In order to introduce a priori knowledge into the LoD estimation, we propose a new definition of the LoD that uses Information Theory tools and handles noise of any kind: rather than relying on error probabilities alone, it is based on the amount of information that the chemical system can extract from the information source. Our findings indicate that benchmarking analytical systems by their ability to provide information about the presence/absence of the analyte (our proposed approach) is a more general and appropriate framework, while converging to the usual values when dealing with Gaussian noise.
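The binary-channel analogy can be made concrete with a short calculation. Modeling the analytical system as a binary channel with false-positive rate `alpha` and false-negative rate `beta`, the mutual information between the sample state (blank/analyte) and the reported decision depends on the prior probability `p` of an analyte sample, so two systems with identical error rates (and hence identical classical LoD) can convey very different amounts of information. This is an illustrative sketch using standard information-theoretic formulas; the parameter names and the specific values are ours, not taken from the paper.

```python
import math


def binary_entropy(q: float) -> float:
    """Shannon entropy (in bits) of a Bernoulli(q) variable."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * math.log2(q) - (1.0 - q) * math.log2(1.0 - q)


def mutual_information(p: float, alpha: float, beta: float) -> float:
    """Mutual information I(X; Y) in bits for a binary channel.

    X: true state (1 = analyte present, with prior probability p).
    Y: decision (1 = detected), with false-positive rate alpha
       (P(Y=1 | X=0)) and false-negative rate beta (P(Y=0 | X=1)).
    Uses I(X; Y) = H(Y) - H(Y | X).
    """
    p_detect = p * (1.0 - beta) + (1.0 - p) * alpha  # P(Y = 1)
    h_y = binary_entropy(p_detect)
    h_y_given_x = p * binary_entropy(beta) + (1.0 - p) * binary_entropy(alpha)
    return h_y - h_y_given_x


# Same error rates (same classical LoD), different priors:
balanced = mutual_information(0.50, 0.05, 0.05)
rare = mutual_information(0.01, 0.05, 0.05)
print(f"prior 0.50: {balanced:.3f} bits")  # close to 0.714 bits
print(f"prior 0.01: {rare:.3f} bits")      # far less information extracted
```

With equiprobable states the channel extracts roughly 0.71 bits per measurement, but when analyte samples are rare (prior 0.01) the same error rates yield only a few hundredths of a bit, which is the disparity the abstract's information-based LoD definition is designed to capture.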
Citation: Analytica Chimica Acta
Pub Type: Journals
Keywords: Limit of Detection, Information Theory, Mutual Information, Heteroscedasticity, False Positive/Negative Errors, Gas Discrimination and Quantification