Cutting-edge nanoscale microscopy techniques such as near-field scanning microwave microscopy (NSMM) and atom probe tomography (APT) enable a multi-dimensional virtual representation of an artifact to be constructed from a set of experimental measurements. These imaging techniques, which are currently being advanced by NIST physicists, can yield powerful insights into physical material properties, with impacts ranging from fundamental physics to U.S. industrial competitiveness.
However, quantitatively accurate NSMM and APT images with meaningful associated uncertainties remain elusive. For imaging methods such as these, no analytical physical or empirical statistical model exists from which to directly, accurately, and automatically construct the image of an artifact of arbitrary shape and composition from experimental measurements. These imaging technologies cannot reach their full potential until this problem is overcome. Deep machine-learning (ML) networks offer a natural solution: they have the power and flexibility to model the full feature space of a multi-dimensional dataset. Indeed, ML techniques could be crucial to unlocking the full potential of APT and NSMM imaging, allowing nanoscale artifacts to be virtually reconstructed with unprecedented fidelity. However, deep ML networks nominally require large training datasets, which are not available for time-intensive experimental techniques such as NSMM and APT. This scarcity of training data is the key roadblock limiting the ability of ML to enhance these powerful microscopy techniques.
This project addresses the scarcity of training data for deep ML networks in NSMM and APT. In particular, we are exploring subspace statistical regression (SR) modeling, combined with a suitably structured deep ML network, to develop methods that automatically generate robust synthetic datasets for augmenting experimental data during network training. Achieving our project goals will 1) demonstrate the capability of subspace modeling and data augmentation, making a novel contribution to ML generally, and 2) position us, with this new ML capability, to solve difficult problems presently challenging NIST physicists, including APT and NSMM image construction. In the future, the results of this work could establish a statistical approach and a corresponding deep learning architecture able to automatically create subspace-validated solutions with attached uncertainties in techniques such as APT and NSMM.
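To make the data-augmentation idea concrete, the following is a minimal sketch of one way synthetic samples can be drawn from a low-dimensional subspace fitted to a small experimental dataset. This is an illustration only, not the project's actual SR method: it uses a plain linear (PCA) subspace with a Gaussian model of the subspace coefficients, and the function names and the stand-in random "dataset" are hypothetical.

```python
import numpy as np

def fit_subspace(X, k):
    """Fit a k-dimensional linear (PCA) subspace to the rows of X."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered data; the top-k right singular vectors span the subspace.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:k]                # (k, d) orthonormal rows
    coeffs = Xc @ basis.T         # subspace coordinates of each sample
    return mean, basis, coeffs

def synthesize(mean, basis, coeffs, n, rng):
    """Draw n synthetic samples from a Gaussian fit to the subspace coefficients."""
    mu = coeffs.mean(axis=0)
    cov = np.cov(coeffs, rowvar=False)
    z = rng.multivariate_normal(mu, cov, size=n)
    # Map subspace draws back to the full feature space.
    return mean + z @ basis

rng = np.random.default_rng(0)
# Stand-in for a scarce experimental dataset: 20 samples, 50 features.
X = rng.normal(size=(20, 50))
mean, basis, coeffs = fit_subspace(X, k=5)
X_syn = synthesize(mean, basis, coeffs, n=200, rng=rng)
print(X_syn.shape)  # (200, 50)
```

The augmented set (experimental rows plus `X_syn`) could then be used to train a network that would otherwise overfit the 20 measured samples; a real pipeline would also validate that the synthetic samples remain physically plausible.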