Artifacts in the inference output #8

Open
kasivisu82 opened this issue Nov 12, 2018 · 3 comments
@kasivisu82

Hi,

I've used your SRGAN implementation. The steps are below.
I used 800 images from DIV2K as the training dataset and 90 images from DIV2K as test images.
I first ran SRResNet with VGG54 for 10^5 iterations, then used the obtained weights to initialize (--load) SRGAN with VGG54 and ran it for another 10^5 iterations.
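
Concretely, the commands were roughly of this form (illustrative only; the run names and the intermediate checkpoint path are placeholders, and the flags follow train.py's usage):

python train.py --name srresnet-vgg54 --content-loss vgg54 --train-dir path/to/div2k_train
# after 10^5 iterations, initialize SRGAN from the SRResNet weights
python train.py --name srgan-vgg54 --use-gan --content-loss vgg54 --train-dir path/to/div2k_train --load results/srresnet-vgg54/weights-100000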

The average PSNR and SSIM for Set5, Set14, BSD100, and DIV2K's 90 test images are given below:
[BSD100] PSNR: 25.18, SSIM: 0.6398
[Set14] PSNR: 26.25, SSIM: 0.6966
[Set5] PSNR: 29.33, SSIM: 0.8370
[div2k-90] PSNR: 26.48, SSIM: 0.6984
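
For reference, averages like these can be computed along the following lines (a minimal sketch assuming scikit-image >= 0.19 and 8-bit RGB images; the SRGAN paper itself evaluates on the luminance channel with a border crop, so exact numbers depend on the protocol):

# Sketch: average PSNR/SSIM over paired HR/SR image folders.
import glob
import numpy as np
from skimage.io import imread
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def average_metrics(hr_dir, sr_dir):
    psnrs, ssims = [], []
    for hr_path, sr_path in zip(sorted(glob.glob(hr_dir + "/*.png")),
                                sorted(glob.glob(sr_dir + "/*.png"))):
        hr = imread(hr_path)   # ground-truth high-resolution image
        sr = imread(sr_path)   # super-resolved network output
        psnrs.append(peak_signal_noise_ratio(hr, sr, data_range=255))
        ssims.append(structural_similarity(hr, sr, channel_axis=-1, data_range=255))
    return float(np.mean(psnrs)), float(np.mean(ssims))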

But some of the images (from all 4 sets above) had artifacts (please see the attached images).
The artifacts are present at some iterations and absent at others, but even the iterations with low training loss have artifacts.

Can you help me with the following questions?

  1. Should I also train the SRGAN with VGG54 for another 10^5 iterations with learning rate 1e-5? Will that solve the artifact issue?
  2. Why are the artifacts appearing? Is there a fix, or does some output saturation need to be applied?
  3. Have you encountered these artifacts? Is there any way I can correct them?
  4. In the files I've attached, iteration 195500 had the lowest training loss but has artifacts, while 198500 had a worse training loss but no artifacts. Which iteration's weights should I choose?

(Attached images: 195500_hr, 195500_out, 198500_hr, 198500_out)

Looking forward to your reply, as I'm a bit stuck on this.

@trevor-m
Owner

trevor-m commented Nov 13, 2018

I first ran SRResNet with VGG54 for 10^5 iterations.

How do the results of this training look? If I recall correctly, in the paper they train SRResNet with MSE for 10^6 iterations as the initial seed, even for SRGAN-VGG54 (10^5 iterations should be fine too). I've had problems with bad local minima when using VGG54 loss without the GAN, so maybe this could be related to your problem.

I haven't encountered these artifacts with SRGAN, but with a similar architecture I experienced the exact same artifacts from iteration to iteration. I suspect it could just be due to the instability of GANs. You can just pick the results from an iteration without artifacts if you are primarily concerned about visual quality. If you only care about the metrics, then pick the iteration with the best metrics; the metrics don't correspond with visual quality that well anyway.
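
One way to automate that selection (a sketch; pass in whatever evaluation routine you already use to score a weights file):

# Sketch: pick the saved checkpoint whose validation PSNR is highest.
# evaluate_psnr is a hypothetical callable: weights path -> average PSNR.
import glob

def pick_best_checkpoint(evaluate_psnr, weights_glob="results/srgan-vgg54/weights-*"):
    best_path, best_psnr = None, float("-inf")
    for path in sorted(glob.glob(weights_glob)):
        psnr = evaluate_psnr(path)
        if psnr > best_psnr:
            best_path, best_psnr = path, psnr
    return best_path, best_psnr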

@kasivisu82
Author

Hi Trevor,

Thank you for your clarifications.

Based on your feedback, we've decided to try a few things, which are listed below.

  1. Change the scaling to 4X by running the upscaling block in the Generator only once instead of twice.
    I hope this change to the Generator topology is sufficient. Please comment.
  2. Run (a possible command sequence is sketched at the end of this comment):
    a) SRResNet with VGG22 for 10^6 iterations with learning rate 1e-4.
    b) SRGAN with VGG22, with weights from (a), for 10^5 iterations with learning rate 1e-4.
    c) SRGAN with VGG22, with weights from (b), for 10^5 iterations with learning rate 1e-5.
  3. I tried to run (a) with MSE, but I get an error when I load the MSE weights obtained from (a) and try to initialize (--load) them for (b).
    Can you please suggest the ideal commands for a proper MSE-based SRResNet, followed by the SRGAN commands?

Kindly let me know your comments on all three steps.
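
For concreteness, the command sequence I have in mind for step 2 is roughly the following (a sketch; I'm assuming --content-loss also accepts vgg22 and that checkpoints follow the results/<name>/weights-<iteration> pattern):

# (a) SRResNet with VGG22 content loss for 10^6 iterations (default learning rate assumed to be 1e-4)
python train.py --name srresnet-vgg22 --content-loss vgg22 --train-dir path/to/dataset
# (b) SRGAN with VGG22, initialized from (a), for 10^5 iterations
python train.py --name srgan-vgg22 --use-gan --content-loss vgg22 --train-dir path/to/dataset --load results/srresnet-vgg22/weights-1000000
# (c) continue SRGAN with VGG22 from (b) for 10^5 iterations at learning rate 1e-5
python train.py --name srgan-vgg22 --use-gan --content-loss vgg22 --train-dir path/to/dataset --load results/srgan-vgg22/weights-100000 --learning-rate 1e-5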

@trevor-m
Owner

trevor-m commented Jan 24, 2019

Sorry for the delay.

  1. Yes, that is a good way to change the scaling to 4x.
  2. That looks correct. How does the output look?
  3. That is strange, what errors are you getting?

This sequence should reproduce the paper's result:

python train.py --name srresnet-mse --content-loss mse --train-dir path/to/dataset
# wait for srresnet-mse to train for 10^6 iterations
python train.py --name srgan-vgg54 --use-gan --content-loss vgg54 --train-dir path/to/dataset --load results/srresnet-mse/weights-1000000
# wait for srgan to train for 10^5 iterations
python train.py --name srgan-vgg54 --use-gan --content-loss vgg54 --train-dir path/to/dataset --load results/srgan-vgg54/weights-100000 --learning-rate 1e-5
# wait for srgan to train for 10^5 iterations
