# Convolutional Neural Networks W1 lab problem

I’m trying to finish lab 1, and I don’t understand what is going wrong.

conv_forward function test

I initialized Z with zeros following the instruction

Z = np.zeros((m, n_H, n_W, n_C))

The issue is possibly in how you computed n_H and n_W, or maybe inside your conv_single_step() function.

Well… I deliberately separated the variable assignment from the conv_single_step execution to check where the issue is, and it looks like conv_single_step works fine. Or at least it returns some value.

I used these formulas to calculate n_H and n_W:

`n_H = floor((n_H_prev - f + 2 * pad) / stride) + 1`
`n_W = floor((n_W_prev - f + 2 * pad) / stride) + 1`

and if I print them to the output I get

m = 2
n_H = 3
n_W = 4
n_C = 8
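For concreteness, here is how I compute those dimensions. The specific input sizes below are illustrative values I inferred from the printed shapes (a 5×7 input, 3×3 filter, pad 1, stride 2), not the lab’s actual variable names:

``````python
# Assumed values, inferred from the printed shapes of the first test case
n_H_prev, n_W_prev = 5, 7   # unpadded input height/width
f, pad, stride = 3, 1, 2    # filter size, padding, stride

# Standard convolution output-dimension formulas
n_H = int((n_H_prev - f + 2 * pad) / stride) + 1
n_W = int((n_W_prev - f + 2 * pad) / stride) + 1
print(n_H, n_W)  # 3 4
``````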

In addition to those possibilities, another thing to check is the ranges on your loops over h and w. There are a number of common mistakes there:

1. Don’t include the stride in the range: you are looping over the output space, so you must touch every location. The striding happens in the input space.
2. Another mistake is swapping the meanings of h and w. Remember that h stands for “height”, not “horizontal”.
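To make point 1 concrete, here is a minimal sketch of the loop structure. This is not the graded solution, and the names are only illustrative; the point is that the loops range over the output space, while the stride appears only when computing the slice corners in the padded input:

``````python
import numpy as np

def conv_forward_sketch(A_prev_pad, W, b, stride):
    """Illustrative convolution forward pass over an already-padded input."""
    m, n_H_pad, n_W_pad, n_C_prev = A_prev_pad.shape
    f, _, _, n_C = W.shape
    n_H = (n_H_pad - f) // stride + 1
    n_W = (n_W_pad - f) // stride + 1
    Z = np.zeros((m, n_H, n_W, n_C))
    for i in range(m):
        for h in range(n_H):              # no stride in the range ...
            vert_start = h * stride       # ... the stride goes here
            for w in range(n_W):
                horiz_start = w * stride
                a_slice = A_prev_pad[i, vert_start:vert_start + f,
                                        horiz_start:horiz_start + f, :]
                for c in range(n_C):
                    Z[i, h, w, c] = np.sum(a_slice * W[..., c]) + b[0, 0, 0, c]
    return Z
``````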

Here’s a thread that lays out the algorithm in words; it’s definitely worth a look.

Those look correct. Here’s my output with some prints added:

``````
stride 2 pad 1
New dimensions = 3 by 4
Shape Z = (2, 3, 4, 8)
Shape A_prev_pad = (2, 7, 9, 4)
Z[0,0,0,0] = -2.651123629553914
Z[1,2,3,7] = 0.4427056509973153
Z's mean =
0.5511276474566768
Z[0,2,1] =
[-2.17796037  8.07171329 -0.5772704   3.36286738  4.48113645 -2.89198428
10.99288867  3.03171932]
cache_conv[0][1][2][3] =
[-1.1191154   1.9560789  -0.3264995  -1.34267579]
First Test: All tests passed!
New dimensions = 9 by 11
Shape Z = (2, 9, 11, 8)
Shape A_prev_pad = (2, 11, 13, 4)
Z[0,0,0,0] = 1.4306973717089302
Z[1,8,10,7] = -0.6695027738712113
New dimensions = 2 by 3
Shape Z = (2, 2, 3, 8)
Shape A_prev_pad = (2, 5, 7, 4)
Z[0,0,0,0] = 8.430161780192094
Z[1,1,2,7] = -0.2674960203423288
New dimensions = 13 by 15
Shape Z = (2, 13, 15, 8)
Shape A_prev_pad = (2, 17, 19, 4)
Z[0,0,0,0] = 0.5619706599772282
Z[1,12,14,7] = -1.622674822605305
Second Test: All tests passed!
``````

You can see my Z dimensions agree with yours on the first test case. So I’m guessing it’s one of the two issues I listed in my previous reply.

Thank you for your help. I finally managed to find an error in my code: I had the looping range wrong, and changed it to “for h in range(n_H)”.

But after fixing it, I got this:

I started looking at where I could have forgotten the stride. I changed the code to this:

{moderator edit - solution code removed}

But I’m still getting wrong results.

This code of yours is correct. Maybe your weights and biases are wrong. Or `a_slice_prev`?

`a_slice_prev` is a slice of `a_prev_pad`, which means you have to use `a_prev_pad` to grab `a_slice_prev`.

And, you have to grab weights and biases from W and b, respectively.

If the issue persists, let us know.
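As a hedged illustration of grabbing the filter and bias for output channel `c` (the shapes below are assumptions based on the lab’s conventions: `W` is `(f, f, n_C_prev, n_C)` and `b` is `(1, 1, 1, n_C)`):

``````python
import numpy as np

def conv_single_step_sketch(a_slice_prev, weights, biases):
    # elementwise product, sum over all entries, then add the scalar bias
    return np.sum(a_slice_prev * weights) + float(biases)

# Illustrative values only
a_slice_prev = np.ones((3, 3, 4))    # an (f, f, n_C_prev) slice of a_prev_pad
W = np.full((3, 3, 4, 8), 0.5)       # filters, shape (f, f, n_C_prev, n_C)
b = np.zeros((1, 1, 1, 8))           # biases, shape (1, 1, 1, n_C)
c = 0
z = conv_single_step_sketch(a_slice_prev, W[:, :, :, c], b[0, 0, 0, c])
print(z)  # 18.0
``````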


I agree with Saif that the stride code you showed looked correct, but there are other land mines to step on here. It might help to see your actual results, but probably not. Another mistake I’ve seen, which is perhaps less obvious, is not managing the indentation correctly, so that the loop over c ends up not nested under the other three loops.

One other possibility is that you fixed the code but didn’t actually press “Shift-Enter” on the cell containing the function, and just ran the test again; that runs the old code. One sledgehammer way to make sure everything is in sync and What You See Is What You Get is:

``````
Kernel -> Restart and Clear Output
Cell -> Run All
``````

If none of the above suggestions sheds any light, then it’s probably time to look at your code. We shouldn’t do that directly here, but we can create a DM thread for it. I’ll send you a DM about that.

Thanks. I will try this myself one more time after reading the thread you mentioned, and only after that will I message you for help.

OK, thanks. I finally managed to find the error: I missed the vertical and horizontal offsets while slicing.
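For anyone hitting the same thing: the “verticals and horizontals” I missed are the slice corners, computed from the output position and the stride. The values below are just an example:

``````python
f, stride = 3, 2      # assumed filter size and stride
h, w = 1, 2           # an output position

# Corners of the slice in the padded input
vert_start = h * stride
vert_end = vert_start + f
horiz_start = w * stride
horiz_end = horiz_start + f
print(vert_start, vert_end, horiz_start, horiz_end)  # 2 5 4 7

# a_slice_prev = a_prev_pad[i, vert_start:vert_end, horiz_start:horiz_end, :]
``````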


Glad to hear that you found the solution under your own power. Onward!