PhUn: a PyTorch model for phase unwrapping
The original network was designed in TensorFlow; this repository is a PyTorch reimplementation of it.
Install the dependencies: `pip install -r example-requirements.txt`
I've made the following changes to the structure:
- Replication padding mode in the conv3x3 blocks: experiments have shown that it matters at the edges of phase maps, otherwise the unwrapping quality is low (see the sketch below)
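
A minimal sketch of such a block, assuming a `conv3x3` helper made of convolution + batch norm + ReLU (the exact layer composition is an assumption, not taken from the original code):

```python
import torch.nn as nn

def conv3x3(in_channels: int, out_channels: int) -> nn.Sequential:
    # padding_mode="replicate" repeats edge pixels instead of zero-padding,
    # which avoids artificial phase jumps at the borders of the phase map.
    return nn.Sequential(
        nn.Conv2d(in_channels, out_channels, kernel_size=3,
                  padding=1, padding_mode="replicate"),
        nn.BatchNorm2d(out_channels),
        nn.ReLU(inplace=True),
    )
```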
The dataset was generated synthetically, following articles [2, 3], using two methods in equal proportions (a sketch follows the list):
- Interpolation of square matrices (with uniformly distributed elements) of various sizes (2x2 to 15x15) up to 256x256, multiplied by a random value so the magnitude is between 0 and 22 rad
- Randomly generated Gaussians on a 256x256 field, with a random number of functions, random means and STDs, multiplied by a random value so the magnitude is between 2 and 20 rad. Experiments with real, simple phase images show that this method makes the network better adapted to real-life examples
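
A rough sketch of the two generators. The function names, the interpolation mode, and the number/width ranges of the Gaussians are illustrative assumptions; only the sizes and magnitude ranges come from the description above. The network input is assumed to be the true phase wrapped into (-pi, pi]:

```python
import numpy as np
import torch
import torch.nn.functional as F

def generate_interpolated(max_mag: float = 22.0) -> np.ndarray:
    # Method 1: upsample a small uniform-random matrix (2x2..15x15) to 256x256
    # and scale it so the magnitude lies between 0 and 22 rad.
    n = np.random.randint(2, 16)
    small = torch.rand(1, 1, n, n)
    big = F.interpolate(small, size=(256, 256), mode="bicubic", align_corners=True)
    big = big[0, 0].numpy()
    big = (big - big.min()) / (big.max() - big.min())
    return big * np.random.uniform(0.0, max_mag)

def generate_gaussians(min_mag: float = 2.0, max_mag: float = 20.0) -> np.ndarray:
    # Method 2: sum a random number of 2D Gaussians with random means and STDs
    # on a 256x256 field, scaled so the magnitude lies between 2 and 20 rad.
    yy, xx = np.mgrid[0:256, 0:256]
    field = np.zeros((256, 256))
    for _ in range(np.random.randint(1, 6)):          # number of Gaussians (assumed 1..5)
        mx, my = np.random.uniform(0, 256, size=2)    # random means
        sx, sy = np.random.uniform(10, 80, size=2)    # random STDs (assumed range)
        field += np.exp(-((xx - mx) ** 2 / (2 * sx ** 2) + (yy - my) ** 2 / (2 * sy ** 2)))
    field = (field - field.min()) / (field.max() - field.min())
    return field * np.random.uniform(min_mag, max_mag)

def wrap(phase: np.ndarray) -> np.ndarray:
    # Network input: the phase wrapped into (-pi, pi]; the target is the unwrapped phase.
    return np.angle(np.exp(1j * phase))
```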
The model can be illustrated as follows (figure from the original article [1]):
In the original paper, the authors describe the training hyperparameters as follows:
- loss: pixelwise MSE
- optimizer: Adam
- learning rate: 1e-4 "at start and then decreasing"
Training converged to a near-zero loss (0.096) with Adam at a learning rate of 0.0001.
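
For reference, a minimal training sketch with these settings; the batch size, the number of epochs, and the step-decay schedule are assumptions (the paper only states that the learning rate decreases):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset

def train(model: nn.Module, train_set: Dataset, num_epochs: int = 50) -> None:
    # Hyperparameters from the original paper: pixelwise MSE loss, Adam, lr = 1e-4.
    loader = DataLoader(train_set, batch_size=16, shuffle=True)   # batch size assumed
    criterion = nn.MSELoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

    for _ in range(num_epochs):
        for wrapped, unwrapped in loader:          # (wrapped phase, true phase) pairs
            optimizer.zero_grad()
            loss = criterion(model(wrapped), unwrapped)
            loss.backward()
            optimizer.step()
        scheduler.step()                           # "decreasing" learning rate (step decay assumed)
```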
- Code refactoring
1. G. Dardikman-Yoffe, D. Roitshtain, S. K. Mirsky, N. A. Turko, M. Habaza, and N. T. Shaked, "PhUn-Net: ready-to-use neural network for unwrapping quantitative phase images of biological cells," Biomed. Opt. Express 11, 1107-1121 (2020).
2. K. Wang, Y. Li, K. Qian, J. Di, and J. Zhao, "One-step robust deep learning phase unwrapping," Opt. Express 27, 15100-15115 (2019).
3. G. E. Spoorthi et al., "PhaseNet 2.0: Phase Unwrapping of Noisy Data Based on Deep Learning Approach," IEEE Trans. Image Process. 29, 4862-4872 (2020).