While implementing backprop for the convolution layer, there is this line inside the nested for loop over i, h, w, c, and it works well:

da_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :] += W[:, :, :, c] * dZ[i, h, w, c]
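For context, here is a minimal self-contained sketch of the loop structure I mean (the shapes, the single training example, and the stride/filter values are my own assumptions for illustration, not the full assignment code):

```python
import numpy as np

# Assumed toy shapes: one example, 2x2 output gradient, one output channel,
# 2x2 filter, one input channel, stride 1.
np.random.seed(0)
m, n_H, n_W, n_C = 1, 2, 2, 1
f, n_C_prev, stride = 2, 1, 1

dZ = np.random.randn(m, n_H, n_W, n_C)
W = np.random.randn(f, f, n_C_prev, n_C)

# Padded-input height/width implied by the output shape above.
n_H_prev = (n_H - 1) * stride + f
n_W_prev = (n_W - 1) * stride + f
da_prev_pad = np.zeros((n_H_prev, n_W_prev, n_C_prev))

i = 0  # only one example in this sketch
for h in range(n_H):
    for w in range(n_W):
        for c in range(n_C):
            vert_start = h * stride
            vert_end = vert_start + f
            horiz_start = w * stride
            horiz_end = horiz_start + f
            # The line in question: accumulate this window's contribution.
            da_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :] += \
                W[:, :, :, c] * dZ[i, h, w, c]
```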
But it seems to me that assigning with plain "=" should be equally correct, yet it doesn't work at all. Since the LHS slice of da_prev_pad changes on each iteration of the loop, we are writing to different segments each time, so it shouldn't really matter whether we use "+=" or just "=".
Could you please clarify this? I have a similar doubt about the backprop implementation for pooling.
Thank you.