Deep learning networks find a new application in phase-contrast imaging

Date: 10-06-2020

Phase-contrast imaging has been widely applied in optical and X-ray measurements since the phase contains information about the 3D morphology and refractive index of the specimen. A phase map can be retrieved from a single-distance coherent diffraction image by exploiting the contrast transfer function (CTF) and the total variation (TV) regularization phase retrieval scheme.
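Under the weak-object linear CTF model, the phase can be recovered from a single-distance intensity image by a regularized inversion in Fourier space. The sketch below is illustrative only, not the authors' code: it uses a simple Tikhonov damping term (parameter `alpha`) in place of the TV regularization described in the article, and all function and variable names are assumptions.

```python
import numpy as np

def ctf_phase_retrieval(intensity, wavelength, distance, pixel_size, alpha=1e-6):
    """Regularized CTF inversion for a weak pure-phase object (sketch).

    Under the linear CTF model, the Fourier transform of the intensity
    contrast is I~(f) = 2*sin(pi*lambda*z*|f|^2) * phi~(f); the phase is
    recovered by a damped division (alpha keeps the CTF zeros from
    blowing up noise).
    """
    ny, nx = intensity.shape
    fy = np.fft.fftfreq(ny, d=pixel_size)
    fx = np.fft.fftfreq(nx, d=pixel_size)
    f2 = fx[np.newaxis, :] ** 2 + fy[:, np.newaxis] ** 2
    chi = np.sin(np.pi * wavelength * distance * f2)   # phase contrast transfer function
    contrast = np.fft.fft2(intensity - intensity.mean())
    phase_ft = chi * contrast / (2.0 * chi ** 2 + alpha)
    return np.real(np.fft.ifft2(phase_ft))
```

The damping term is exactly what more sophisticated regularizers (TV in CTF-TV, the learned prior in CTF-Deep) replace, which is where the improved noise robustness comes from.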


However, the conventional CTF-TV phase retrieval algorithm, which is based on a traditional compressive-sensing optimization problem, generally struggles in the presence of strong noise. Thus, retrieving the phase under heavy noise remains a challenge.


Recently, a research team led by Prof. YAO Baoli at the State Key Laboratory of Transient Optics and Photonics, Xi'an Institute of Optics and Precision Mechanics (XIOPM) of the Chinese Academy of Sciences (CAS), reported a robust phase-contrast imaging technique aided by deep learning networks (termed CTF-Deep) that overcomes this challenge. The CTF-Deep approach allows researchers to accurately retrieve phase maps from high-noise diffraction images by utilizing a unified regularization scheme. This work was published in the journal Optics Letters (impact factor 3.866, 2019).


By integrating a neural network into a CTF-based optimization algorithm, CTF-Deep inherits the strengths of both optimization and deep-learning techniques. Compared to current optimization-based algorithms, such as analytical CTF inversion, CTF-TV, and BM3D denoising, which are severely affected by noise and distortions, CTF-Deep is exceptionally robust thanks to its use of the DnCNN image-denoising neural network.
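The general idea of integrating a denoising network into such an optimization loop can be sketched as a plug-and-play splitting scheme that alternates a closed-form CTF data-fidelity update with a denoising step. In CTF-Deep the denoiser is a trained DnCNN; the stand-in below uses a toy mean filter, and all names, parameters, and the splitting variant (half-quadratic) are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def pnp_ctf_retrieval(intensity, chi, denoise, n_iter=20, rho=0.1):
    """Plug-and-play CTF phase retrieval via half-quadratic splitting (sketch).

    Alternates (1) a denoising step on the current phase estimate and
    (2) a closed-form Fourier-domain update that fits the linear CTF
    model 2*chi*phi~ = I~ while staying close to the denoised image.
    `denoise` is any image-to-image callable (a trained DnCNN in CTF-Deep).
    """
    contrast = np.fft.fft2(intensity - intensity.mean())
    phase = np.zeros_like(intensity)
    for _ in range(n_iter):
        v_ft = np.fft.fft2(denoise(phase))        # prior step (learned denoiser)
        # data step: argmin_x ||2*chi*x~ - I~||^2 + rho*||x~ - v~||^2
        phase_ft = (2.0 * chi * contrast + rho * v_ft) / (4.0 * chi ** 2 + rho)
        phase = np.real(np.fft.ifft2(phase_ft))
    return phase

def mean_filter(img):
    """Toy stand-in for a learned denoiser: 5-point local average."""
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0
```

The appeal of this structure is that the data-fidelity step has a cheap closed form in Fourier space, so any denoiser, hand-crafted or learned, can be swapped in without re-deriving the algorithm.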

 

The CTF-Deep method also shows great potential as a useful tool for other phase-contrast imaging applications, such as X-ray or terahertz imaging.

 

The configuration of the CTF-Deep network (left) and the phase retrieval result of CTF-Deep compared with current methods (right). (Image by XIOPM)


This research was funded by the National Natural Science Foundation of China (Grants No. 81427802, 61905277, and 61705256).