Live cell imaging reveals the dynamics of single cells and provides insight into phenotype regulation.
Characterizing and predicting cell population behavior is critical to the efficiency and flexibility of manufacturing processes for cell-based products. Characterization is challenging because individual cells within populations exhibit heterogeneous and dynamic characteristics, and it is often unclear which measurements provide meaningful, predictive information about important population characteristics such as proliferative capacity or differentiation potential. Individual cells express emergent properties to slightly different degrees and at different rates, information that can only be obtained by quantifying the dynamic characteristics of individual cells over time.
Emergent characteristics measured in cell populations are controlled by regulatory biochemical networks. Fluorescent proteins can report on the gene expression of regulatory network components, and their temporal expression patterns can reveal strong correlations that suggest causative relationships between network components.
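One common way to relate the temporal expression of two reporters is cross-correlation: if fluctuations in one reporter consistently precede matching fluctuations in another, the lag at the correlation peak hints at a directed relationship. The sketch below is illustrative only; the function name, sampling interval, and synthetic signals are assumptions, not the project's actual analysis.

```python
import numpy as np

def reporter_lag(x, y, dt=1.0):
    """Estimate the lag at which reporter y best correlates with reporter x.

    x, y: 1-D arrays of background-corrected fluorescence intensities
    sampled at interval dt. Returns (lag, peak correlation). A positive
    lag means changes in x precede changes in y.
    """
    x = (x - x.mean()) / x.std()          # z-score both traces
    y = (y - y.mean()) / y.std()
    n = len(x)
    corr = np.correlate(y, x, mode="full") / n
    lags = (np.arange(corr.size) - (n - 1)) * dt
    k = int(np.argmax(corr))
    return lags[k], corr[k]

# Synthetic check: a smooth intensity pulse and a copy delayed by 5 frames.
t = np.arange(200)
x = np.exp(-((t - 100) / 10.0) ** 2)
y = np.roll(x, 5)                      # y peaks 5 frames after x
lag, c = reporter_lag(x, y, dt=2.0)    # dt = 2 min between frames
```

Note that a correlation peak alone cannot establish causation; it only narrows the set of plausible regulatory orderings, which perturbation experiments would then need to test.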
Quantifying temporal fluctuations of reporter molecules in induced pluripotent stem cells requires segmenting cells that lie very close to one another in colonies and tracking them over long periods as they divide. This requires imaging each cell approximately every 2 minutes, making it necessary to minimize the use of fluorescence and to train AI models to recognize and track cells in bright-field images. One must also be able to sample thousands of cells within that time frame, resulting in very large volumes of data.
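Once a segmentation model has produced per-frame cell centroids, a simple way to track cells between consecutive frames is greedy nearest-neighbour linking with a displacement cutoff. This is a minimal sketch under that assumption; real iPSC tracking must also handle divisions, occlusions, and segmentation errors, which this toy linker does not.

```python
import numpy as np

def link_centroids(prev, curr, max_disp=15.0):
    """Greedy nearest-neighbour linking of cell centroids between frames.

    prev, curr: (N, 2) and (M, 2) arrays of (row, col) centroids, assumed
    to come from an upstream segmentation step. Returns a list of (i, j)
    index pairs; cells that would have to move farther than max_disp
    pixels between frames are left unlinked.
    """
    if len(prev) == 0 or len(curr) == 0:
        return []
    # Pairwise distances between all previous and current centroids.
    d = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
    links, used = [], set()
    for i in np.argsort(d.min(axis=1)):       # most confident cells first
        for j in np.argsort(d[i]):            # nearest unused candidate
            if int(j) not in used and d[i, j] <= max_disp:
                links.append((int(i), int(j)))
                used.add(int(j))
                break
    return links

prev = np.array([[10.0, 10.0], [50.0, 50.0]])
curr = np.array([[52.0, 51.0], [11.0, 9.0]])
links = link_centroids(prev, curr)   # links cell 0 -> 1 and cell 1 -> 0
```

The displacement cutoff (`max_disp`, hypothetical here) encodes the assumption that a cell moves only a few pixels in the ~2 minutes between frames; frame rate and linking tolerance trade off directly.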
Moving, processing, and analyzing image data at these volumes and rates requires advances in computational methods. Efficient development of optimal analyses of living cells, including AI pipelines, will require methods for dynamic analysis as data are being acquired. Integrating data at this scale will require software engineering and parallel and distributed computing. Of particular importance to NIST is establishing the reliability of the measurements, which includes using benchmark materials and protocols at the instrument and validating AI models on the computational side.
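Analysis that keeps pace with acquisition can be organized as a pool of workers consuming frames as they arrive. The sketch below uses a thread pool over an in-memory frame sequence; the function names and threshold are illustrative placeholders, not the project's pipeline, and a production system would distribute this across processes or machines.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def analyze_frame(frame):
    """Placeholder per-frame measurement: mean intensity and the number of
    pixels above a fixed threshold (a stand-in for real segmentation and
    tracking work)."""
    return float(frame.mean()), int((frame > 0.5).sum())

def run_pipeline(frames, workers=4):
    """Fan per-frame analysis across worker threads; map() preserves
    acquisition order, so results stay aligned with imaging time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze_frame, frames))

frames = [np.full((4, 4), 0.2), np.full((4, 4), 0.6)]
results = run_pipeline(frames, workers=2)
```

The same structure supports the validation concern above: because each frame's analysis is an isolated function call, benchmark images with known ground truth can be injected into the stream and checked alongside experimental data.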