Hi,
I have a question regarding the usefulness of passing retain_graph=True when calling backward() on the discriminator's loss.
Is it because the discriminator will be used again later when training the generator?
Thanks.
Hi Dhia_Znaidi!
Hope you are doing well.
Yes, the retain_graph=True argument is passed during the discriminator's backward pass to keep the computational graph alive. This is done because the same graph will be traversed again when training the generator.
By retaining the graph, you ensure that gradients can still be computed and propagated through the discriminator (and back into the generator) during the subsequent backward pass on the generator's loss. Without retain_graph=True, the computational graph would be freed after the first backward pass, which is the default behavior, to save memory.
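For concreteness, here is a minimal sketch of that scenario with toy networks; all names (netG, netD, optD, optG) and shapes are made up for illustration, not taken from any particular course code:

```python
import torch
import torch.nn as nn

# Toy stand-ins for a generator and discriminator (hypothetical
# architectures, just to keep the example self-contained).
netG = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))
netD = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
criterion = nn.BCELoss()
optD = torch.optim.Adam(netD.parameters(), lr=2e-4)
optG = torch.optim.Adam(netG.parameters(), lr=2e-4)

real = torch.randn(4, 8)          # stand-in for a batch of real data
fake = netG(torch.randn(4, 16))   # fake batch; its graph runs through netG

# --- Discriminator step ---
optD.zero_grad()
lossD = criterion(netD(real), torch.ones(4, 1)) \
      + criterion(netD(fake), torch.zeros(4, 1))
# retain_graph=True keeps the graph through fake (i.e. through netG)
# alive so the generator's backward pass below can reuse it.
lossD.backward(retain_graph=True)
optD.step()

# --- Generator step (backprops through the retained graph) ---
optG.zero_grad()
lossG = criterion(netD(fake), torch.ones(4, 1))
lossG.backward()  # would raise a RuntimeError without retain_graph above
optG.step()
```

Note that another common pattern is to feed netD(fake.detach()) in the discriminator step; since that cuts the graph through the generator, the discriminator's backward pass no longer consumes it and retain_graph=True becomes unnecessary.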
Regards,
Nithin
Great, thanks for the clarification!