gradient_check gives me an error with a wrong value output

The error it reports is:
"There is a mistake in the backward propagation! difference = 0.33333334789859204"
It also shows the following traceback:
"AssertionError Traceback (most recent call last)
in
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: Wrong value. It is not one of the expected values"
This is regarding the gradient_check function. I have read the instructions and changed dW2 and db1, but I am still not getting the right answer, so any hint or help in this direction would be really helpful.
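For anyone comparing notes on this exercise: the reported difference is typically a relative distance between the analytic gradient and the numerical approximation, computed with np.linalg.norm so that the result is a scalar (which is what the failing assert on np.ndarray is checking for). A minimal sketch, with variable names of my own choosing rather than the assignment's exact code:

```python
import numpy as np

def gradient_difference(grad, gradapprox):
    # Relative difference between the analytic and numerical gradients:
    # ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
    numerator = np.linalg.norm(grad - gradapprox)                   # scalar, not an array
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

# Toy example: two nearly identical gradients give a tiny difference,
# suggesting backprop agrees with the numerical estimate.
grad = np.array([1.0, 2.0, 3.0])
gradapprox = np.array([1.0, 2.0, 3.0001])
diff = gradient_difference(grad, gradapprox)
print(diff)
```

If np.linalg.norm is skipped and element-wise operations are used instead, `difference` comes out as an array, which trips the first assert in the traceback above.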

Hi @GGT_deeplearning,

Can you share your Lab ID with me? Thanks, I'll take a look.

I am not sure where my lab ID is, but I am copy-pasting the URL of the notebook. My problem is with the last cell, gradient_check. I have now worked it out and submitted, but it was only a guess; my understanding does not agree with what I see in the lecture.

So theta is a vector containing all the weights and biases, so theta[i] should be a single number? I guess it is being treated as a vector instead?
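On the theta question: after the parameters are flattened into one vector, each theta[i] should indeed be a scalar, and the numerical gradient nudges one component at a time with a two-sided difference. (Note that if theta is kept as a column vector of shape (n, 1), then theta[i] is a length-1 array rather than a number, which is one common way to end up with an ndarray result.) A hedged sketch of the per-component loop, using a toy cost J of my own in place of the assignment's forward propagation:

```python
import numpy as np

def J(theta):
    # Toy cost function standing in for the network's forward-pass cost.
    return np.sum(theta ** 2)

def numerical_gradient(theta, epsilon=1e-7):
    # theta: flattened 1-D vector of all weights and biases,
    # so theta[i] is a scalar and we perturb one component at a time.
    gradapprox = np.zeros_like(theta)
    for i in range(theta.shape[0]):
        theta_plus = np.copy(theta)
        theta_plus[i] += epsilon           # nudge only component i up
        theta_minus = np.copy(theta)
        theta_minus[i] -= epsilon          # nudge only component i down
        # Two-sided difference approximation of dJ/dtheta_i
        gradapprox[i] = (J(theta_plus) - J(theta_minus)) / (2 * epsilon)
    return gradapprox

theta = np.array([1.0, -2.0, 0.5])
print(numerical_gradient(theta))  # close to 2*theta, since dJ/dtheta = 2*theta
```

The key detail is copying theta before perturbing it, so each iteration changes exactly one component and leaves the rest of the vector untouched.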

Hi @GGT_deeplearning, are you still stuck here?

In your assignment, at the top right there's a "Help" option. When you click it, a panel will open, and your lab ID will be shown at the bottom.

(When you reply, kindly tag me in it)

Do you have any clue why this issue came up?


I have this error and am still unable to fix it. I have fixed the fake values included in dW2 and dB3, but I am still unable to pass the implementation check and hence couldn't reach the pass mark. Attached is the error screenshot.