C2W1 - Assert error on gradient check

Gradient Checking HW for Week 1 of "Improving Deep Neural Networks … "

In exercise 4 (gradient_check_n), I get the correct difference:

“There is a mistake in the backward propagation! difference = 0.2850931567761623”

But the next assert

assert np.any(np.isclose(difference, expected_values))

throws an error

"TypeError: ufunc ‘isfinite’ not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule '‘safe’ "

I checked, and my difference value is a float.

I apologize ahead of time if this has already been answered somewhere.


Please show us the full exception trace that you are getting. There must be something funny going on with the type of your difference value.
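For example, one way to get exactly that message is for the value reaching the assert to be non-numeric rather than a float: np.isclose converts its inputs to arrays and calls np.isfinite on them, and an object-dtype input (e.g. None) breaks it. A purely illustrative sketch, not taken from the assignment:

import numpy as np

difference = None  # hypothetical: a non-numeric value instead of a float
expected_values = [0.2850931567761623, 1.1890913024229996e-07]

# np.isfinite cannot handle the object-dtype array built from None, so this raises:
# TypeError: ufunc 'isfinite' not supported for the input types, ...
np.any(np.isclose(difference, expected_values))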


Thanks for your consideration


Interesting. I added print statements to my code as well and here’s what I see:

num_parameters 47
39: gradapprox[i] = [0.], grad[i] = [0.]
40: gradapprox[i] = [0.], grad[i] = [0.]
41: gradapprox[i] = [0.], grad[i] = [0.]
42: gradapprox[i] = [0.19763344], grad[i] = [0.19763343]
43: gradapprox[i] = [0.], grad[i] = [0.]
44: gradapprox[i] = [0.], grad[i] = [0.]
45: gradapprox[i] = [2.24404227], grad[i] = [2.24404238]
46: gradapprox[i] = [0.21225742], grad[i] = [0.21225753]
numerator = 8.050575492696896e-07
norm(grad) = 3.3851797873981373
norm(gradapprox) = 3.3851796259558395
denominator = 6.770359413353977
type(difference) = <class 'numpy.float64'>
difference = 1.1890913024229996e-07
Your backward propagation works perfectly fine! difference = 1.1890913024229996e-07

I had fixed the intentional bugs in back prop, so I get the other value. But the type of the difference return value is exactly the same as you show.
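For reference, the numbers in that printout follow the standard gradient-check formula, a relative difference between the analytic and numerical gradients. A minimal sketch with assumed variable names:

import numpy as np

def relative_difference(grad, gradapprox):
    # difference = ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

In the printout above, that works out to 8.05e-07 / 6.77, which matches the printed difference of about 1.19e-07.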

So I don’t have a theory about what happened here and I’ll need to see your notebook. We can’t do that in a public way here, but I will send you a DM about how to share code.


Thank you so much for your help

{moderator edit - solution code removed}

Your solution code is fine, but you accidentally modified the logic given to you in the template code in such a way that the function has no return statement in the case where a mistake is detected. I sent you more details in a private DM.
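To illustrate the failure mode in general terms (a hedged sketch, not the actual template code): if the return statement lives only inside one branch, the other branch implicitly returns None, and the assert then hits the isfinite TypeError shown above.

def check(difference, threshold=2e-7):  # hypothetical function and names
    if difference > threshold:
        print(f"There is a mistake in the backward propagation! difference = {difference}")
        # no return here -> the function implicitly returns None
    else:
        print(f"Your backward propagation works perfectly fine! difference = {difference}")
        return difference

# Fix: place `return difference` after the if/else so every path returns the value.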


Thank you so much. I know it must be a pain to find these bugs. I have no idea how I modified that cell.

All good now!
