Week 1 / Exercise 4 - gradient_check_n

Hello again, I’m on the last exercise (Exercise 4) of Week 1 (Course 2). Regarding the function

{moderator edit - solution code removed}

but it complained

IndexError: only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices

It seems like it’s not happy with theta_plus[i], but I don’t see anything wrong with the way I did it.
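(For anyone reading along: that particular IndexError is what NumPy raises when an ordinary array is indexed with a string key, which is what happens inside forward_propagation_n if it receives a raw vector instead of a parameters dictionary. A minimal illustration, not the assignment code:)

```python
import numpy as np

theta_plus = np.zeros((47, 1))   # a flattened parameter vector, as in this exercise

# forward_propagation_n expects a dictionary and does lookups like parameters["W1"];
# trying that on a plain ndarray reproduces the quoted IndexError
try:
    theta_plus["W1"]
except IndexError as err:
    print(err)
# only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`)
# and integer or boolean arrays are valid indices
```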

Hi @Khiem_Viet_Ngo, have a look at the instructions, which say to use:

Calculate J_plus[i] using forward_propagation_n(x, y, vector_to_dictionary(θ+))

Thanks SJ. Somehow, I took “vector_to_dictionary” as a recommendation; it’s actually a function imported from the gc_utils package. However, even after I changed the call to

J_plus[i], cache = forward_propagation_n(X, Y, vector_to_dictionary(theta_plus[i]))

the error becomes:

ValueError: cannot reshape array of size 1 into shape (5,4)

The problem is that you are passing only one element of theta_plus to the function. That is not what is intended. If that statement is not enough to get you there, try looking at the source code for vector_to_dictionary.
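For context, the gc_utils helper slices the flat 47-element vector and reshapes the pieces back into the parameter matrices, which is why a single element (size 1) cannot be reshaped into the (5, 4) shape of W1. A simplified sketch of what it does (shapes taken from this exercise; see gc_utils for the exact code):

```python
import numpy as np

def vector_to_dictionary(theta):
    """Roughly what the gc_utils helper does: unflatten theta into the
    parameter dictionary that forward_propagation_n expects."""
    parameters = {}
    parameters["W1"] = theta[0:20].reshape((5, 4))   # fails if theta has only one element
    parameters["b1"] = theta[20:25].reshape((5, 1))
    parameters["W2"] = theta[25:40].reshape((3, 5))
    parameters["b2"] = theta[40:43].reshape((3, 1))
    parameters["W3"] = theta[43:46].reshape((1, 3))
    parameters["b3"] = theta[46:47].reshape((1, 1))
    return parameters
```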


Thanks Paul! So the call is supposed to be

J_plus[i], cache = forward_propagation_n(X, Y, vector_to_dictionary(theta_plus))

When I re-ran my code, it said “There is a mistake in the backward propagation! difference = 0.2850931567761623”.

However, in the backward_propagation function, I had

dtheta = x

and I got “All tests passed” along with the message:

Your backward propagation works perfectly fine! difference = 2.919335883291695e-10

So, what exactly is wrong here?

There are two separate sections here: the 1D case and the N-dimensional case. So which message is from which section?

With correct code, I get this for the gradient_check output (the 1D case):

Your backward propagation works perfectly fine! difference = 2.919335883291695e-10
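If it helps, the 1D section checks the simple cost J(θ) = θx, whose derivative is x, which is why dtheta = x is the correct answer there. A minimal sketch of that check, assuming ε = 1e-7 as in the exercise:

```python
def forward_propagation(x, theta):
    return theta * x                      # 1D cost: J(theta) = theta * x

def backward_propagation(x, theta):
    return x                              # dJ/dtheta = x, hence dtheta = x is correct here

def gradient_check(x, theta, epsilon=1e-7):
    J_plus = forward_propagation(x, theta + epsilon)
    J_minus = forward_propagation(x, theta - epsilon)
    gradapprox = (J_plus - J_minus) / (2 * epsilon)   # two-sided numerical derivative
    grad = backward_propagation(x, theta)
    return abs(grad - gradapprox) / (abs(grad) + abs(gradapprox))

print(gradient_check(x=2, theta=4))       # on the order of 1e-10 when backprop is correct
```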

For gradient_check_n, here is the output if I do not fix the intentional errors they give you in the back propagation logic:

There is a mistake in the backward propagation! difference = 0.2850931567761623

And here is what I get once I fix the intentional bugs:

Your backward propagation works perfectly fine! difference = 1.1890913024229996e-07
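In case it helps to interpret those numbers: the difference that gradient_check_n reports is the relative L2 distance between the backprop gradients and the two-sided numerical approximation. A minimal sketch of that comparison (variable names are illustrative, not the graded code):

```python
import numpy as np

def relative_difference(grad, gradapprox):
    """grad: gradients from backward_propagation_n, flattened to a vector.
    gradapprox: numerical gradients, gradapprox[i] = (J_plus[i] - J_minus[i]) / (2 * epsilon).
    """
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

# A value around 1e-7 means backprop matches the numerical check;
# a value around 1e-1 (like the 0.285... above) signals a bug in backward propagation.
```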

I fixed it, thanks! I thought I was only supposed to correct the code I wrote, but in this case I had to correct the given code as well. Got it!

Great! Please note that they explained all that in the instructions. Here’s the relevant paragraph:

It seems that there were errors in the backward_propagation_n code! Good thing you’ve implemented the gradient check. Go back to backward_propagation and try to find/correct the errors (Hint: check dW2 and db1).
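(For later readers: the intentional bugs sit in the provided backward_propagation_n, and fixing them means restoring the standard backprop expressions for the two terms in the hint. A hedged sketch with dummy arrays, not the exact notebook code:)

```python
import numpy as np

# Dummy shapes matching this exercise (hidden layers of 5 and 3 units, m = 3 examples),
# only to show the standard formulas the hint points at.
m = 3
dZ2 = np.random.randn(3, m)
A1 = np.random.randn(5, m)
dZ1 = np.random.randn(5, m)

dW2 = 1. / m * np.dot(dZ2, A1.T)                   # standard form: no stray extra factor
db1 = 1. / m * np.sum(dZ1, axis=1, keepdims=True)  # standard form: scale by 1/m
```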

I ran into the same issue. Thanks for the answer!