Hi,
I was able to get full marks for the assignment, but I am still learning PyTorch and had a question about the use of retain_graph in the discriminator's backward call in the provided code:
disc_loss.backward(retain_graph=True)
I checked the PyTorch documentation and forums to understand the parameter. As far as I understand, by default the computation graph is freed after a backward pass, and retain_graph=True keeps it alive so that backward can be called again on the full graph or a portion of it. I also saw some examples where this is needed, but I don't see such a requirement in our assignment, and removing it also seems to work fine for me. Is this understanding correct? If not, please help me understand the utility of the parameter here.
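To check my understanding, I tried a minimal standalone example (not from the assignment code) showing that a second backward call fails unless the graph was retained, and that gradients accumulate across calls:

```python
import torch

# By default, .backward() frees the graph, so a second call fails:
z = torch.tensor(3.0, requires_grad=True)
w = z ** 2
w.backward()
graph_freed = False
try:
    w.backward()            # graph already freed -> RuntimeError
except RuntimeError:
    graph_freed = True

# With retain_graph=True the graph survives and backward can run again;
# note that gradients accumulate across calls:
x = torch.tensor(2.0, requires_grad=True)
y = x ** 2
y.backward(retain_graph=True)
first = x.grad.clone()      # dy/dx = 2x = 4
y.backward()                # works because the graph was retained
second = x.grad.clone()     # 4 + 4 = 8 (accumulated)
```

My guess is that in our training loop this would only matter if the generator's backward pass reused part of the discriminator's graph; since the fake samples appear to be detached before computing the discriminator loss, the two graphs seem separate, which would explain why removing the flag works for me. Please correct me if I'm missing something.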
Thank you very much!