C1W1 test_gen_loss error: RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

These error messages are always a bit hard to interpret. The error happens during the backward pass, and it is telling you that the loss tensor was computed without gradient tracking on the relevant tensors, so autograd has nothing to differentiate. The situation is fundamentally asymmetric: when you compute the discriminator's loss, you can detach the generator output, because you don't need gradients for the generator's parameters there. But when you train the generator, you can't do that: the generator's cost is the discriminator's output on the fake images, so the gradients have to flow back through the discriminator and into the generator. You don't apply the gradients to the discriminator's weights in that case, but you still need them to be computed. Here's another thread from a while ago that discusses these points.
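
For concreteness, here's a minimal sketch of that asymmetry in PyTorch. The names (`gen`, `disc`, `gen_opt`, `disc_opt`, `criterion`, `real`, `noise`) are placeholders for whatever your notebook uses, not the assignment's actual code:

```python
import torch

# Hypothetical setup: gen and disc are nn.Modules, gen_opt and disc_opt their
# optimizers, criterion is e.g. nn.BCEWithLogitsLoss(), real is a batch of
# real images, noise is a batch of latent vectors.

# --- Discriminator step: detach the generator output ---
fake = gen(noise)
disc_fake_pred = disc(fake.detach())          # no gradients flow back into gen
disc_real_pred = disc(real)
disc_loss = (criterion(disc_fake_pred, torch.zeros_like(disc_fake_pred)) +
             criterion(disc_real_pred, torch.ones_like(disc_real_pred))) / 2
disc_opt.zero_grad()
disc_loss.backward()
disc_opt.step()

# --- Generator step: do NOT detach ---
fake = gen(noise)
disc_fake_pred = disc(fake)                   # gradients flow through disc into gen
gen_loss = criterion(disc_fake_pred, torch.ones_like(disc_fake_pred))
gen_opt.zero_grad()
gen_loss.backward()                           # grads computed for both disc and gen
gen_opt.step()                                # but only gen's weights get updated
```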

So my guess is that perhaps you did something like detaching the generator output inside your generator loss in this case.
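
For example (using the same placeholder names as above), either of these patterns leaves the loss with no grad_fn, and `backward()` then raises exactly that RuntimeError:

```python
# Pattern 1: detaching the tensor the loss is built from breaks the graph,
# so the resulting loss has requires_grad=False and no grad_fn.
disc_fake_pred = disc(gen(noise)).detach()
gen_loss = criterion(disc_fake_pred, torch.ones_like(disc_fake_pred))
gen_loss.backward()   # RuntimeError: element 0 of tensors does not require grad ...

# Pattern 2: computing the loss inside torch.no_grad() has the same effect.
with torch.no_grad():
    pred = disc(gen(noise))
    gen_loss = criterion(pred, torch.ones_like(pred))
gen_loss.backward()   # same RuntimeError
```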