Stuck in conv_backward - Step_by_Step

I'm stuck on the error below.

For padding:

    A_prev_pad = zero_pad(A_prev, pad)
    dA_prev_pad = zero_pad(dA_prev, pad)

For selecting the current example:

    # select ith training example from A_prev_pad and dA_prev_pad
    a_prev_pad = A_prev_pad[i]
    da_prev_pad = dA_prev_pad[i]

    ValueError                        Traceback (most recent call last)
    in
         10
         11 # Test conv_backward
    ---> 12 dA, dW, db = conv_backward(Z, cache_conv)
         13
         14 print("dA_mean =", np.mean(dA))

    in conv_backward(dZ, cache)
         65 print(da_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :])
         66 print('W shape:', W[:,:,:,c].shape)
    ---> 67 da_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :] += (W[:,:,:,c] * dZ[i, h, w, c])
         68 dW[:,:,:,c] += a_slice * dZ[i, h, w, c]
         69 db[:,:,:,c] += dZ[i, h, w, c]

    ValueError: operands could not be broadcast together with shapes (2,0,3) (2,2,3) (2,0,3)
Can anyone guide me?

I'm not sure I can fully explain that error message, since the broadcast on that line should involve at most two operands (an in-place `+=` can report a third shape for the output array, which may be why three shapes appear). That line of code looks correct, though, and notice that the LHS shape (2,0,3) has a zero in its second dimension, meaning the `horiz_start:horiz_end` slice is empty. So my guess is that the real issue is in how you are computing `vert_start`, `vert_end`, `horiz_start` and `horiz_end`, or in how you are managing the stride in general. The stride must not be used in the ranges of the loops here: the loops run over the output positions with a step of 1, exactly as they did in `conv_forward`, and the stride enters only when computing the start indices. If your `conv_forward` code passes the tests, please compare how its loop ranges and `vert_start`/`vert_end` computations work against what you have here.
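To make the loop structure concrete, here is a sketch of the indexing pattern being described. The shapes are invented for illustration and this is not the full `conv_backward`, but the variable names (`f`, `stride`, `n_H`, `n_W`, `vert_start`, etc.) follow the assignment:

```python
import numpy as np

# Hypothetical shapes, chosen only to illustrate the indexing.
f, stride = 2, 2
n_C_prev, n_C = 3, 8
n_H, n_W = 2, 2                            # output dims, as in conv_forward
da_prev_pad = np.zeros((5, 5, n_C_prev))   # one padded training example
W = np.random.randn(f, f, n_C_prev, n_C)
dZ = np.random.randn(1, n_H, n_W, n_C)

i = 0
for h in range(n_H):           # plain ranges over OUTPUT positions, step of 1
    for w in range(n_W):
        for c in range(n_C):
            # the stride enters only here, exactly as in conv_forward
            vert_start = stride * h
            vert_end = vert_start + f
            horiz_start = stride * w
            horiz_end = horiz_start + f
            # both sides now have shape (f, f, n_C_prev), so += broadcasts
            da_prev_pad[vert_start:vert_end, horiz_start:horiz_end, :] += \
                W[:, :, :, c] * dZ[i, h, w, c]
```

If the stride were (wrongly) folded into the loop ranges, or the start/end indices were computed differently from `conv_forward`, the slice on the left could come out empty or mis-sized, which is exactly what a (2,0,3) shape indicates.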