Why do we set retain_graph = True in the discriminator's loss?

Hi, I was wondering why we set retain_graph = True in the discriminator's backward call, since the generator is updated independently (or so I understood)?

Thanks for any pointers!

backward() usually frees the intermediate buffers of the computation graph to save memory. If we pass retain_graph = True, the graph won't be deleted, so you can compute gradients through it again (here is a nice explanation).
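For concreteness, here is a tiny standalone example (not from the course code) of that behavior: without retain_graph=True, a second backward() on the same output raises an error because the first call already freed the graph's buffers.

```python
import torch

# Minimal illustration of retain_graph, unrelated to the GAN code itself.
x = torch.tensor([2.0], requires_grad=True)
y = x ** 2

# By default this call would free the graph's intermediate buffers.
y.backward(retain_graph=True)   # keep the graph alive
print(x.grad)                   # tensor([4.])

# Because the graph was retained, a second backward on the same output works;
# without retain_graph=True above, this line would raise a RuntimeError.
y.backward()
print(x.grad)                   # gradients accumulate: tensor([8.])
```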

I don't see any reason why we should save the graph in crit_loss.backward(retain_graph=True), as we don't use the crit_loss variable anywhere else, so the graph is never needed after that single backward pass.
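To make that concrete, here is a minimal, self-contained sketch of a critic update (all module and variable names are placeholders I made up, not the course's exact API) where crit_loss is backpropagated exactly once, so retain_graph can simply be dropped:

```python
import torch
from torch import nn

# Tiny stand-in generator and critic so the step is runnable end to end.
gen = nn.Linear(8, 4)
crit = nn.Linear(4, 1)
crit_opt = torch.optim.SGD(crit.parameters(), lr=1e-3)

real = torch.randn(16, 4)
noise = torch.randn(16, 8)

crit_opt.zero_grad()
fake = gen(noise).detach()                          # detach: no gradients flow into gen
crit_loss = crit(fake).mean() - crit(real).mean()   # WGAN-style critic loss as an example
crit_loss.backward()                                # single backward pass; the freed graph
                                                    # is never needed again afterwards
crit_opt.step()
```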
