Saving computational graph during discriminator backpropagation

In the first assignment, “Your First GAN”, we call disc_loss.backward(retain_graph=True) during the discriminator update to retain the computational graph.

  1. How is the retained computational graph useful during the generator update?
  2. Do we also need to retain the computational graph when backpropagating through the generator?

Thanks in advance.


The generator’s gradients flow through the discriminator by definition, so we need the discriminator’s graph when training the generator. The situation is asymmetric, though: when we train the discriminator, we don’t need the generator’s gradients. Here’s a thread which discusses this in more detail, and here’s another thread that talks about the difference between detach() and retain_graph.
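For what it’s worth, here is a minimal sketch of that asymmetry, assuming a standard PyTorch training step (names such as gen, disc, gen_opt, disc_opt, criterion, and z_dim are my own placeholders, not the assignment’s exact code):

```python
import torch

def gan_training_step(gen, disc, gen_opt, disc_opt, criterion, real, z_dim, device):
    batch_size = real.size(0)

    ## Discriminator update ##
    disc_opt.zero_grad()
    noise = torch.randn(batch_size, z_dim, device=device)
    fake = gen(noise)
    # detach() cuts the graph at the generator's output, so this backward
    # pass never reaches (and never needs) the generator's parameters.
    disc_fake_pred = disc(fake.detach())
    disc_real_pred = disc(real)
    disc_loss = (criterion(disc_fake_pred, torch.zeros_like(disc_fake_pred))
                 + criterion(disc_real_pred, torch.ones_like(disc_real_pred))) / 2
    # retain_graph=True keeps this backward graph alive after the call
    # (as the assignment does); it does not involve the generator's graph,
    # which was already cut off by detach() above.
    disc_loss.backward(retain_graph=True)
    disc_opt.step()

    ## Generator update ##
    gen_opt.zero_grad()
    noise_2 = torch.randn(batch_size, z_dim, device=device)
    fake_2 = gen(noise_2)
    # No detach() here: the generator's gradients have to flow *through*
    # the discriminator to reach the generator's parameters.
    gen_fake_pred = disc(fake_2)
    gen_loss = criterion(gen_fake_pred, torch.ones_like(gen_fake_pred))
    # No retain_graph needed here: nothing reuses this graph afterwards,
    # so the default behaviour (freeing it after backward) is fine.
    gen_loss.backward()
    gen_opt.step()

    return disc_loss.item(), gen_loss.item()
```

In this sketch the generator step runs a fresh forward pass, so its backward call only walks the graph built by gen(noise_2) and disc(fake_2). detach() is what keeps the discriminator step from touching the generator, while retain_graph only controls whether a given graph can be backpropagated through more than once.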
