Course 2 week 1 gradient_check_n error

The error:
ValueError Traceback (most recent call last)
3 cost, cache = forward_propagation_n(X, Y, parameters)
4 gradients = backward_propagation_n(X, Y, cache)
----> 5 difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"

in gradient_check_n(parameters, gradients, X, Y, epsilon, print_msg)
37 theta_plus = np.copy(parameters_values)
38 theta_plus[i] = theta_plus[i]+epsilon
---> 39 J_plus[i]=forward_propagation_n(X, Y, vector_to_dictionary(theta_plus[i] ))

~/work/release/W1A3/ in vector_to_dictionary(theta)
53 """
54 parameters = {}
---> 55 parameters["W1"] = theta[: 20].reshape((5, 4))
56 parameters["b1"] = theta[20: 25].reshape((5, 1))
57 parameters["W2"] = theta[25: 40].reshape((3, 5))

ValueError: cannot reshape array of size 1 into shape (5,4)

The code:

{moderator edit - solution code removed}

Updated code:

{moderator edit - solution code removed}

You are just passing one element of theta_plus to forward propagation. You are supposed to be passing the whole vector, right?
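To illustrate the point (with a toy cost function, not the course's graded code), here is a minimal sketch of numerical gradient checking. Note that the full perturbed vector `theta_plus` is passed to the forward function, while only element `i` is nudged; the names `forward` and `grad_check` are made up for this example:

```python
import numpy as np

def forward(theta):
    # Toy cost: J(theta) = sum(theta**2), so the analytic gradient is 2*theta
    return np.sum(theta ** 2)

def grad_check(theta, grad, epsilon=1e-7):
    num_params = theta.shape[0]
    grad_approx = np.zeros(num_params)
    for i in range(num_params):
        theta_plus = np.copy(theta)        # copy the WHOLE vector
        theta_plus[i] += epsilon           # perturb only element i
        theta_minus = np.copy(theta)
        theta_minus[i] -= epsilon
        # Pass the full vectors, not theta_plus[i]
        grad_approx[i] = (forward(theta_plus) - forward(theta_minus)) / (2 * epsilon)
    numerator = np.linalg.norm(grad - grad_approx)
    denominator = np.linalg.norm(grad) + np.linalg.norm(grad_approx)
    return numerator / denominator

theta = np.array([1.0, -2.0, 3.0])
difference = grad_check(theta, 2 * theta)  # expect a very small difference
```

If you passed `theta_plus[i]` instead, the forward function would receive a single scalar, which is exactly what triggers the `cannot reshape array of size 1` error in `vector_to_dictionary`.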


Hi, I seem to be having the same error even though I have the following code:

{moderator edit - solution code removed}

Am I missing something in my logic or syntax, or am I just writing a wrong piece of code?

You are only copying one element of parameters_values to initialize theta_plus. Don’t you need the entire array? I would expect it to just throw an error on the second line as soon as i > 0. The point is you are “tweaking” a different element in each iteration as you march through all the elements of the full array.
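A small standalone demonstration of that second bug (the variable names here are illustrative, not the assignment's actual code): copying `parameters_values[i]` yields a size-1 array, which cannot be reshaped into a weight matrix, whereas copying the whole array and then tweaking element `i` works:

```python
import numpy as np

parameters_values = np.arange(6.0).reshape(-1, 1)  # stand-in for the flattened parameters

# Buggy: copies only ONE element, so the result has size 1
theta_plus_bad = np.copy(parameters_values[0])
# theta_plus_bad.reshape((2, 3)) would raise:
#   ValueError: cannot reshape array of size 1 into shape (2,3)

# Correct: copy the entire vector, then perturb just element i
i = 3
theta_plus = np.copy(parameters_values)
theta_plus[i] = theta_plus[i] + 1e-7
```

The copy of the full array is what lets each loop iteration "tweak" a different single element while leaving the rest of the parameters untouched.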

Hi @paulinpaloalto,
You are absolutely correct!! I corrected the code and received the proper output. Thank you so much!!! The logic you provided is understood; I seem to have been overlooking that point in the code while verifying just about everything else. :joy: Thank you for the help, have a great day! :nerd_face: