W1 A3 Gradient Checking (EX 4)

Hi

I'm running into an error while implementing the last exercise of the gradient checking assignment. I get the following traceback:


ValueError                                Traceback (most recent call last)
in <module>
      3 cost, cache = forward_propagation_n(X, Y, parameters)
      4 gradients = backward_propagation_n(X, Y, cache)
----> 5 difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"

in gradient_check_n(parameters, gradients, X, Y, epsilon, print_msg)
     37     theta_plus = np.copy(parameters_values)
     38     theta_plus[i] = theta_plus[i] + epsilon
---> 39     J_plus[i] = forward_propagation_n(X, Y, vector_to_dictionary(theta_plus))
     40     # YOUR CODE ENDS HERE
     41

ValueError: cannot copy sequence with size 2 to array axis with dimension 1

I don't know what I am doing wrong.

Can someone help me please?

The pre-written code given to you was J_plus[i], _ = forward_propagation_n(...). You changed it to J_plus[i] = forward_propagation_n(...), and I guess that caused the error: forward_propagation_n returns a (cost, cache) tuple, so without the underscore you are trying to store the whole two-element tuple in a single slot of J_plus, which is exactly the "size 2" the ValueError complains about.
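To illustrate the difference, here is a minimal sketch with a hypothetical stand-in for forward_propagation_n (the real function, its arguments, and its cache are different, but it also returns a (cost, cache) pair):

```python
import numpy as np

def fake_forward_propagation_n(x):
    # Hypothetical stand-in for the assignment's forward_propagation_n:
    # like the real function, it returns a (cost, cache) tuple.
    cost = x ** 2
    cache = (x, x)  # placeholder for the real cache of intermediate values
    return cost, cache

J_plus = np.zeros(3)

# Wrong: this tries to store the whole two-element (cost, cache) tuple
# in a single float slot of J_plus, so numpy raises a ValueError.
try:
    J_plus[0] = fake_forward_propagation_n(2.0)
except ValueError:
    print("ValueError: cannot store a (cost, cache) tuple in J_plus[0]")

# Right: unpack the tuple, keep the cost, and discard the cache with "_".
J_plus[0], _ = fake_forward_propagation_n(2.0)
print(J_plus[0])  # 4.0
```

The underscore is just ordinary tuple unpacking: it receives the cache and throws it away, leaving only the scalar cost to be stored in J_plus[i].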

Thank you! My code works now.