NLP C2_W4_2 unit tests failing (back_prop)

Hello, I think I mostly have the back_prop function working, but two of the unit tests are failing, both related to the b1 vector. This part is pretty straightforward, and I believe I am following the formula accurately. I followed similar steps to what I did when calculating grad_W1, using the l1 and z1 vectors. The b1 gradient is then computed using a ones vector created with np.ones((m, 1)). The error from the failed tests is below. Any thoughts?

Here is my output:

Wrong output values for gradient of b1 vector.
Expected: [[ 0.56665733]
[ 0.46268776]
[ 0.1063147 ]
[-0.17481454]
[ 0.11041817]
[ 0.32025188]
[-0.51827161]
[ 0.08430878]
[ 0.19341 ]
[ 0.08339139]
[-0.35949678]
[-0.13053946]
[ 0.19055422]
[ 0.56405985]
[ 0.13321988]]
Got: [[ 0.46004568]
[ 0.40483404]
[ 0.07318845]
[-0.0995559 ]
[ 0.01777417]
[ 0.14951921]
[-0.17889287]
[ 0.01897284]
[ 0.11459772]
[ 0.02604062]
[-0.32940889]
[-0.11874333]
[ 0.04899498]
[ 0.06254933]
[ 0.02570647]].
Wrong output values for gradient of b1 vector.
Expected: [[ 0.01864644]
[-0.31966546]
[-0.3564441 ]
[-0.31703253]
[-0.26702975]
[ 0.14815984]
[ 0.25794505]
[ 0.24893135]
[ 0.05895103]
[-0.15348205]]
Got: [[ 0.0183693 ]
[-0.25618039]
[-0.01923447]
[-0.06038765]
[-0.12080929]
[ 0.10414779]
[ 0.08565018]
[ 0.0896111 ]
[ 0.05432164]
[-0.14636517]].
14 Tests passed
2 Tests failed

You must have almost everything correct if it's only the b1 values that differ. As you say, the code should be pretty straightforward, and it's actually easier than the way they explain it: you really just need the sum of l1 and the factor of 1/m. That's all the dot product with the vector of ones gets you. One thing to note is that you don't actually need z1 directly in that line of code, since it's already "baked into" the l1 value that they computed for you.
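To make that concrete, here is a minimal sketch (not the assignment's actual solution code) showing that the dot product with a ones vector and a plain row-wise sum give the same b1 gradient. The shape of l1 here, (n_hidden, m) with one column per example, is an assumption for illustration:

```python
import numpy as np

# Hypothetical stand-in for l1 with shape (n_hidden, m):
# one column of hidden-layer error terms per training example.
rng = np.random.default_rng(0)
n_hidden, m = 3, 4
l1 = rng.standard_normal((n_hidden, m))

# Formulation using the ones vector: np.dot(l1, ones) sums the
# columns of l1, giving a (n_hidden, 1) result.
grad_b1_dot = (1 / m) * np.dot(l1, np.ones((m, 1)))

# Equivalent, simpler formulation: sum l1 across the batch axis.
grad_b1_sum = (1 / m) * np.sum(l1, axis=1, keepdims=True)

print(np.allclose(grad_b1_dot, grad_b1_sum))  # True
```

Either form works; the key point is that z1 never appears, because l1 already incorporates it.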

If that suggestion doesn’t help, then it’s probably time for the “in case of emergency, break glass” method, meaning looking at your code. Please check your DMs for a message from me in that case. :nerd_face:
