
When I train the model, the d_loss is always 0.0. Is there a problem? #40

Open
IwenLeeO opened this issue Sep 28, 2020 · 4 comments

Comments

@IwenLeeO

Elapsed [0:05:20.807550] batch: 252 d_loss: 0.000000 g_loss: 9.195433
Elapsed [0:05:32.021947] batch: 261 d_loss: 0.000000 g_loss: 9.544848
Elapsed [0:05:43.313830] batch: 270 d_loss: 0.000000 g_loss: 8.533874
Elapsed [0:05:54.605321] batch: 279 d_loss: 0.000000 g_loss: 8.899170
Elapsed [0:06:06.190011] batch: 288 d_loss: 0.000000 g_loss: 18.478725
Elapsed [0:06:17.612306] batch: 297 d_loss: 0.000000 g_loss: 7.378197

@ForawardStar

I have the same issue. Have you solved it?

@foxy297feng

same...

@akanimax
Owner

akanimax commented Apr 12, 2021

Hello!
The default GAN_loss function used in this repo is relativistic-hinge. Please refer to this paper for more details on the loss function: https://arxiv.org/abs/1807.00734. With this variant of the GAN_loss, the discriminator loss stays at 0.0 for most of the training, but the GAN still trains properly.
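For anyone wondering why the value reads exactly 0.0: below is a minimal sketch of the relativistic average hinge discriminator loss described in that paper. This is an illustration in PyTorch, not the repo's exact implementation; the function name and tensor shapes are assumptions. Once the discriminator separates reals and fakes by the margin, both ReLU terms clamp to zero, so the logged d_loss can show 0.0 while the generator still optimizes its own (nonzero) loss, consistent with the maintainer's note that the GAN trains properly.

```python
import torch

def rel_avg_hinge_dis_loss(real_scores: torch.Tensor,
                           fake_scores: torch.Tensor) -> torch.Tensor:
    """Relativistic average hinge discriminator loss (RaHinge),
    following https://arxiv.org/abs/1807.00734. Illustrative sketch only."""
    # Real samples should score at least 1 above the average fake score ...
    r_f_diff = real_scores - fake_scores.mean()
    # ... and fake samples at least 1 below the average real score.
    f_r_diff = fake_scores - real_scores.mean()
    # When both margins are satisfied for every sample, both ReLU terms
    # clamp to zero and the reported d_loss becomes exactly 0.0.
    return (torch.relu(1.0 - r_f_diff).mean()
            + torch.relu(1.0 + f_r_diff).mean())
```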

@foxy297feng

foxy297feng commented Apr 15, 2021 via email

thuangb mentioned this issue Jun 17, 2022