# C4 W1 A1: convolution_model_step_by_step — ValueError: operands could not be broadcast together with shapes (3,3,4) (3,3)

my code:
{Moderator Edit: Solution Code Removed}
input:

``````
np.random.seed(1)
A_prev = np.random.randn(2, 5, 7, 4)
W = np.random.randn(3, 3, 4, 8)
b = np.random.randn(1, 1, 1, 8)
hparameters = {"pad": 1,
               "stride": 2}

Z, cache_conv = conv_forward(A_prev, W, b, hparameters)
z_mean = np.mean(Z)
z_0_2_1 = Z[0, 2, 1]
cache_0_1_2_3 = cache_conv[0][1][2][3]
print("Z's mean =\n", z_mean)
print("Z[0,2,1] =\n", z_0_2_1)
print("cache_conv[0][1][2][3] =\n", cache_0_1_2_3)

conv_forward_test_1(z_mean, z_0_2_1, cache_0_1_2_3)
conv_forward_test_2(conv_forward)

``````

output:

``````
a_slice :  (3, 3, 4)
weights :  (3, 3)
biases  ()
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-65-7e580406a9e8> in <module>
6                "stride": 2}
7
----> 8 Z, cache_conv = conv_forward(A_prev, W, b, hparameters)
9 z_mean = np.mean(Z)
10 z_0_2_1 = Z[0, 2, 1]

<ipython-input-64-4f47a790c95a> in conv_forward(A_prev, W, b, hparameters)
97                     biases=b[0,0,0,c]
98                     print("biases ", biases.shape)
---> 99                     Z[i,h,w,c]= np.sum( np.multiply(a_slice_prev,weights),biases)
100
101

ValueError: operands could not be broadcast together with shapes (3,3,4) (3,3)

``````
• How do I get n_C_prev and n_C for W and b, if c runs from 0 to n_C?
• What would the value of n_C_prev be?
• How do I get W and b and then compute Z?
• Please help; I checked the previous forum questions, but I didn't find an answer to a similar doubt.
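The error itself can be reproduced in isolation (a minimal sketch with made-up arrays, independent of the assignment code): NumPy refuses to align a `(3, 3, 4)` window with a `(3, 3)` weight slice because the trailing dimensions 4 and 3 are incompatible, while two `(3, 3, 4)` arrays multiply element-wise and sum down to a scalar.

```python
import numpy as np

np.random.seed(1)
a_slice = np.random.randn(3, 3, 4)   # one window of A_prev: (f, f, n_C_prev)
w_bad = np.random.randn(3, 3)        # a wrongly sliced filter: (f, f)
w_good = np.random.randn(3, 3, 4)    # a correctly sliced filter: (f, f, n_C_prev)

# (3, 3, 4) vs (3, 3): the trailing dimensions 4 and 3 cannot be broadcast
try:
    a_slice * w_bad
except ValueError as e:
    print("broadcast error:", e)

# With matching shapes, the element-wise product sums down to a scalar
z = np.sum(a_slice * w_good)
print(np.shape(z))                   # ()
```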

Please note that sharing your code is not allowed. I am deleting it after this reply.

For `Z[i,h,w,c]`, you have to call the `conv_single_step` function with the proper arguments. There is no need to use `sum` or `multiply` yourself.

Also, your `weights` indexing is wrong. It should be [`everything`, `everything`, `everything`, `that one`]. In Python, for `everything` we use `:`, and by `that one` I mean the loop variable (for example, `c`).

Moreover, your `vert_start`, `vert_end`, `horiz_start`, and `horiz_end` are wrong. You forgot the stride factor.
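The stride point can be sketched like this (an illustration only, with assumed values `f = 3` and `stride = 2`, not the graded solution): each output position `(h, w)` maps back to an input window whose corners are offset by the stride, not by `h` and `w` directly.

```python
# Illustration with assumed filter size and stride (not the assignment code)
f, stride = 3, 2

for h in range(2):                    # a couple of output rows
    for w in range(2):                # and output columns
        vert_start = h * stride       # stride factor, not just h
        vert_end = vert_start + f
        horiz_start = w * stride
        horiz_end = horiz_start + f
        print((h, w), "rows", (vert_start, vert_end), "cols", (horiz_start, horiz_end))
```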

Maybe this guide helps you further.

Best,
Saif.

Hey, I know that the code should not be posted; I only posted it because I wanted exact feedback.

You can give feedback and remove the code; I am fine with it.

Thanks for linking the guide, I will look into it.

Edit: the guide worked, thank you.

But why should only the current channel in `weights` change and not the previous channel? I didn't get that.

``````
for c in range(n_C):
    weights = W[:, :, :, c]
    biases = b[0, 0, 0, c]
``````

Here the dimensions of `weights` and `biases` are:

``````
W -- Weights, numpy array of shape (f, f, n_C_prev, n_C)
b -- Biases, numpy array of shape (1, 1, 1, n_C)
``````

So shouldn't the code be:

``````
for c in range(n_C):
    weights = W[:, :, c-1, c]
    biases = b[0, 0, c-1, c]
``````

Or are the previous channels ignored because they are handled in every single step anyway, so we only select the current channel and the previous channels are computed automatically?
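A shape check with made-up dimensions may make this concrete: the `:` in the third axis of `W[:, :, :, c]` already keeps all `n_C_prev` input channels, and the single convolution step multiplies and sums over all of them, so no `c-1` index is needed. Indexing that axis with `c-1` would select a single input channel, which is exactly the `(3, 3)` shape behind the error above; and `b`'s third axis has size 1, so `b[0, 0, c-1, c]` would fail for most values of `c`.

```python
import numpy as np

f, n_C_prev, n_C = 3, 4, 8            # made-up filter size and channel counts
W = np.random.randn(f, f, n_C_prev, n_C)
b = np.random.randn(1, 1, 1, n_C)

c = 2                                 # one output channel
weights = W[:, :, :, c]               # keeps ALL n_C_prev input channels
print(weights.shape)                  # (3, 3, 4) -- matches a_slice

single_channel = W[:, :, c - 1, c]    # picks only input channel c-1
print(single_channel.shape)           # (3, 3) -- the shape from the original error

try:
    b[0, 0, c - 1, c]                 # b's third axis has size 1, so index 1 fails
except IndexError as e:
    print("bad bias index:", e)
```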