retain_graph=True?

Hello,

I understood from the lectures that the generator's weights should not be updated while training the discriminator, so we want the generator's output to be detached. But when training the generator, the loss (and therefore the gradients) is computed from the discriminator's output, so we should not detach the discriminator.
Correct me if I am wrong.

And we applied retain_graph=True when calling backward() for the discriminator. What is it actually doing? Could anyone please explain?
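
To make sure I am describing it correctly, here is roughly the flow I have in mind (netG, netD, opt_D, opt_G and criterion are just placeholder names I made up, not the actual lesson code):

```python
import torch
import torch.nn as nn

# Toy stand-ins to keep the example short; the real models are bigger networks.
netG = nn.Linear(100, 10)   # "generator"
netD = nn.Linear(10, 1)     # "discriminator"
opt_D = torch.optim.Adam(netD.parameters())
opt_G = torch.optim.Adam(netG.parameters())
criterion = nn.BCEWithLogitsLoss()

noise = torch.randn(8, 100)
fake = netG(noise)

# Discriminator step: detach the fake batch so no gradients
# flow back into the generator.
opt_D.zero_grad()
d_loss = criterion(netD(fake.detach()), torch.zeros(8, 1))
d_loss.backward()
opt_D.step()

# Generator step: do NOT detach, because the generator's gradients
# have to flow through the discriminator's output.
opt_G.zero_grad()
g_loss = criterion(netD(fake), torch.ones(8, 1))
g_loss.backward()
opt_G.step()
```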

Thanks in advance.

Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

retain_graph=True: with this argument, the intermediate values of the graph that backward() would normally free are kept in the buffers until the update is completed, so the same graph can be used for another backward pass.
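
As a tiny example (a toy tensor, nothing to do with GANs):

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = (x ** 2).sum()              # builds a small computation graph

y.backward(retain_graph=True)   # saved values of the graph are kept
print(x.grad)                   # tensor([4.])

y.backward()                    # only allowed because retain_graph=True was used above
print(x.grad)                   # tensor([8.]) - gradients accumulate
```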

I know that .backward() computes the gradients of the cost function, but I couldn't follow what you told me. Could you please elaborate?

When backward() is called, the intermediate values saved in the graph are freed, so that graph cannot be used for a second backward pass (for example, for the generator). Specifying retain_graph=True keeps those saved values, so the same graph can be backpropagated through again without re-running the whole forward pass for the generator.
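
As a rough sketch of the GAN situation (netG, netD and criterion below are made-up stand-ins; a real training loop would also zero gradients, step the optimizers and usually detach the fake batch for the discriminator step, so this only shows what retain_graph changes):

```python
import torch
import torch.nn as nn

netG = nn.Linear(10, 10)   # stand-in generator
netD = nn.Linear(10, 1)    # stand-in discriminator
criterion = nn.BCEWithLogitsLoss()

noise = torch.randn(4, 10)
fake = netG(noise)          # one forward pass through the generator
d_out = netD(fake)          # ... and through the discriminator

# Discriminator-side loss: fakes should be classified as 0.
loss_d = criterion(d_out, torch.zeros(4, 1))
loss_d.backward(retain_graph=True)   # keep the graph's saved tensors

# Generator-side loss: reuse the SAME graph, now labelling fakes as 1.
loss_g = criterion(d_out, torch.ones(4, 1))
loss_g.backward()   # this second backward would raise the error quoted
                    # above if retain_graph=True had not been passed
```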