Course 2 Week 1 Gradient Checking Exercise 4

I’ve been stuck on the Gradient Checking Exercise 4 for days. I can’t figure out what I’m doing wrong. Any guidance would be helpful.

ValueError Traceback (most recent call last)
<ipython-input-…> in <module>
3 cost, cache = forward_propagation_n(X, Y, parameters)
4 gradients = backward_propagation_n(X, Y, cache)
----> 5 difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"

in gradient_check_n(parameters, gradients, X, Y, epsilon, print_msg)
37 theta_plus = np.copy(parameters_values)
38 theta_plus[i] = theta_plus[i] + epsilon
---> 39 J_plus[i], _ = forward_propagation_n(X, Y, vector_to_dictionary(theta_plus[i]))
40 # YOUR CODE ENDS HERE
41

~/work/release/W1A3/gc_utils.py in vector_to_dictionary(theta)
53 “”"
54 parameters = {}
—> 55 parameters[“W1”] = theta[: 20].reshape((5, 4))
56 parameters[“b1”] = theta[20: 25].reshape((5, 1))
57 parameters[“W2”] = theta[25: 40].reshape((3, 5))

ValueError: cannot reshape array of size 1 into shape (5,4)

Thanks,
Allan

I fixed it. No need to reply.

Right: you’re passing only one element of the theta_plus vector. It’s one of those cases where the error message is actually pretty clear: how could theta have only one element at that point in the code?
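
For anyone else who lands here: the fix is to pass the whole perturbed vector, not a single entry of it. A minimal sketch of the corrected loop body, assuming the same helper names the notebook uses (only the argument to vector_to_dictionary changes):

    # Perturb one coordinate of the parameter vector at a time
    theta_plus = np.copy(parameters_values)
    theta_plus[i] = theta_plus[i] + epsilon
    # Convert the WHOLE perturbed vector back into a parameters dict
    J_plus[i], _ = forward_propagation_n(X, Y, vector_to_dictionary(theta_plus))

vector_to_dictionary expects the full parameter vector so it can slice out W1, b1, W2, and so on (you can see the slices in the traceback: theta[:20], theta[20:25], theta[25:40], ...). Passing theta_plus[i] hands it a single scalar, and that’s exactly why the reshape complains about an array of size 1.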