Stop discriminator weight updates while updating the generator

Hi @DonFeto,
We can’t detach the discriminator when we compute the loss for the generator because the generator’s loss is calculated using the discriminator and we need to keep everything in that chain to properly calculate the generator’s results. It’s not an issue for the discriminator that we’ve calculated gradients for it because we only apply the gradients for the generator (by calling .step()), and we’re also doubly safe because we zero out the gradients every time before starting backprop (by calling zero_grad()).

For a more detailed discussion of this topic, take a look at "Why should we detach the discriminators input ?!"
