C1W3 WGAN - taking gradients?

Okay, I’m missing something.
How do I “take the gradient of outputs with respect to inputs”?

The loss function is: E[c(x)] - E[c(g(z))]
Is this the gradient? Exactly what expectation are we even taking here?

Hi @Cinemaster,

Up until now, we’ve used loss.backward() as part of backprop. loss.backward() computes the gradients of the loss with respect to the tensors that were used to calculate it, and stores those values in the .grad attribute of those tensors. Then, optimizer.step() uses those gradients to make small adjustments to the weights.
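For example, here’s a minimal sketch of that flow (using a toy linear model just for illustration, not the course’s networks):

```python
import torch

# Toy model: a single linear layer, purely to illustrate the flow.
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(8, 4)            # dummy batch
loss = model(x).mean()           # any scalar computed from the model's outputs

optimizer.zero_grad()            # clear stale gradients from the previous step
loss.backward()                  # fills p.grad for every parameter p used to compute loss
print(model.weight.grad.shape)   # the gradients now live in the .grad attribute
optimizer.step()                 # uses those .grad values to nudge the weights
```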

Here, though, we want to get the value of a gradient so we can use it in calculating the gradient penalty. The function that returns gradient values directly is torch.autograd.grad(). In particular, we want the gradient of the critic’s score (the output) with respect to the mixed image (the input). The next step, once we have this gradient, will be to use it to calculate the gradient penalty.
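For concreteness, here’s a rough sketch of that idea. The function name and exact arguments are illustrative, not the assignment’s solution, and it assumes epsilon was created with requires_grad=True, as in the notebook:

```python
import torch

def get_gradient_sketch(crit, real, fake, epsilon):
    # Interpolate real and fake images, score the mix with the critic,
    # then take the gradient of that score with respect to the mix.
    mixed_images = real * epsilon + fake * (1 - epsilon)
    mixed_scores = crit(mixed_images)

    # torch.autograd.grad returns the gradient of `outputs` with respect to
    # `inputs` instead of stashing it in .grad, so we can use the value in
    # the loss itself.
    gradient = torch.autograd.grad(
        outputs=mixed_scores,
        inputs=mixed_images,
        grad_outputs=torch.ones_like(mixed_scores),  # needed because outputs isn't a scalar
        create_graph=True,   # keep the graph so the penalty itself can be backpropagated
        retain_graph=True,
    )[0]
    return gradient
```

Setting create_graph=True is the key design choice here: the gradient penalty becomes part of the critic’s loss, so PyTorch needs to be able to differentiate through this gradient computation when you later call loss.backward().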

I hope this helps give a basic overview. There’s a little more info in the Gradient Penalty section of the assignment, just above the get_gradient code block. There’s also a link to the documentation for torch.autograd.grad() in the comments for get_gradient if you want to take a deeper look at that.


Thank you, Wendy, that’s helpful.
I really wish this code had been covered in the lectures.