The symmetric geometry of a grating interferometer is advantageous because it yields maximum phase-sensitivity when the sample is placed at the phase grating position. However, the spatial resolution is then reduced by the geometric blur arising from the increased sample-to-detector distance. This trade-off between phase-sensitivity and spatial resolution is a fundamental challenge in interferometric imaging applications with either neutron or conventional x-ray sources, owing to their relatively large beam-defining apertures or focal spots. In this study, a deep learning method is introduced to estimate an image of both high phase-sensitivity and high spatial resolution from a trained neural network, thereby circumventing this trade-off. To realize this, training data sets of differential phase contrast images at a pair of sample positions, one close to the phase grating and the other close to the detector, are numerically generated and used as the inputs to train a generative adversarial network (GAN). The trained network was then applied to real experimental data sets from a neutron grating interferometer, and we obtained images improved in both phase-sensitivity and spatial resolution. Furthermore, the results were compared with counterparts obtained using a traditional image processing method to demonstrate the superiority of the proposed deep learning approach.
neutron imaging, grating interferometry, machine learning, generative adversarial network