Course 4 W1 ProgAssignment 1 Ex5 Backprop

I have tried these print statements at different places to pinpoint the error, and it seems the calculation of the dW step has a shape-mismatch issue. But I have used the formula exactly as given and can't understand the error.
Any help is appreciated

One correction to the print statements shown above:
the `a_slice.shape = (10,4,4,8)` is actually `dZ.shape` that I printed, not `a_slice`'s shape.

EDIT: I just changed

```python
a_slice = a_prev_pad[vert_start:vert_end, horiz_start:horiz_end, c]
```

to

```python
a_slice = a_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :]
```

and it worked. But can anyone help with the intuition behind this?

You have to keep track of the fact that there are two different sets of “channels” in convolution: the input channels and the output channels. In both forward and backward propagation, you are looping over the output space, so the c there is the current output channel in the loop. In the backward case, you are mapping from the one current output channel back to all the input channels of A from which that position in the output was generated, right?
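To make the shapes concrete, here is a minimal NumPy sketch of that point. All the sizes and the name `dZ_ihwc` are hypothetical, chosen just for illustration, not taken from the assignment:

```python
import numpy as np

# Hypothetical shapes: filter size, input channels, output channels
f, n_C_prev, n_C = 3, 8, 16
a_prev_pad = np.random.randn(10, 10, n_C_prev)  # one padded input example (H, W, n_C_prev)
W = np.random.randn(f, f, n_C_prev, n_C)        # each OUTPUT channel has an f x f x n_C_prev filter

vert_start, horiz_start = 2, 4
vert_end, horiz_end = vert_start + f, horiz_start + f
c = 5                                           # current OUTPUT channel in the loop

# The slice keeps the full input-channel depth, matching W[:, :, :, c]:
a_slice = a_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :]
print(a_slice.shape)                            # (3, 3, 8)

# One output scalar came from this whole 3-D patch, so the dW update
# for output channel c also spans the whole patch:
dZ_ihwc = 0.7                                   # scalar gradient dZ[i, h, w, c]
dW_c = a_slice * dZ_ihwc                        # shape (f, f, n_C_prev)
print(dW_c.shape)                               # (3, 3, 8)
```

Indexing with `c` instead of `:` would wrongly treat `c` as an *input* channel and drop the depth dimension, which is what caused the shape mismatch.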


Ohh gotcha, and that's also why we are summing up the contributions across all the channels, right?

Yes, I think that's right, if I'm understanding what you're saying. The way to think about this is that forward and back propagation are literally the reverse of each other in terms of the shapes involved. On the forward step, each position of the output is the "projection" from an f x f x nC_{in} shaped "patch" of the input. So when you reverse that, at each position of the output you are "projecting" the gradients back into that same shaped "patch" of the input, at least for the dA_prev part of the calculation.
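Here's a small NumPy sketch of that reverse projection for the dA_prev part. The sizes and the name `dZ_ihwc` are made up for illustration, not taken from the assignment:

```python
import numpy as np

# Hypothetical shapes: filter size, input channels, output channels
f, n_C_prev, n_C = 3, 8, 16
da_prev_pad = np.zeros((10, 10, n_C_prev))      # gradient buffer for one padded example
W = np.random.randn(f, f, n_C_prev, n_C)

vert_start, horiz_start = 2, 4
vert_end, horiz_end = vert_start + f, horiz_start + f
c = 5                                           # current output channel
dZ_ihwc = 0.7                                   # scalar gradient dZ[i, h, w, c]

# The single output scalar spreads its gradient back over the same
# f x f x n_C_prev patch of the input it was computed from:
da_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :] += W[:, :, :, c] * dZ_ihwc

# Only that one patch has been touched so far; as the loops run over
# every (h, w, c) output position, overlapping patches accumulate via +=.
print(np.count_nonzero(da_prev_pad))
```

The `+=` is what implements the summing over output channels: each output channel's gradient lands back on the same input patch and they accumulate.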


Now I can visualize what's happening in backprop. Thanks a lot!