W1 Gradient Checking E4

I’m getting the following error, but I have no idea what it means.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-27-0b57cb811a44> in <module>
      3 cost, cache = forward_propagation_n(X, Y, parameters)
      4 gradients = backward_propagation_n(X, Y, cache)
----> 5 difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
      6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
      7 

<ipython-input-26-6d080e0c5c17> in gradient_check_n(parameters, gradients, X, Y, epsilon, print_msg)
     38         theta_plus = np.copy(parameters_values)
     39         theta_plus[i] = theta_plus[i] + epsilon
---> 40         J_plus[i] = forward_propagation_n(X, Y, vector_to_dictionary(theta_plus))
     41 
     42         # YOUR CODE ENDS HERE

ValueError: cannot copy sequence with size 2 to array axis with dimension 1

`forward_propagation_n(X, Y, parameters)` returns `cost, cache`, so you need to take that into consideration: the line `J_plus[i] = forward_propagation_n(...)` tries to stuff the whole `(cost, cache)` tuple (a sequence of size 2) into a slot of `J_plus` that holds a single number, which is exactly what the ValueError is complaining about. Unpack the return value and keep only the cost. 🙂
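Here is a minimal sketch of the unpacking fix. The `forward_propagation_n` below is a hypothetical stand-in for the notebook's function (only its two-value return matters for the illustration):

```python
import numpy as np

def forward_propagation_n(X, Y, parameters):
    # Stand-in: the real function computes the cost of a 3-layer network.
    # What matters here is that it returns a (cost, cache) 2-tuple.
    cost = float(np.sum(parameters))
    cache = {"X": X, "Y": Y}
    return cost, cache

X = np.ones((2, 3))
Y = np.ones((1, 3))
theta_plus = np.array([[0.5], [1.5]])

J_plus = np.zeros((2, 1))

# Buggy line from the traceback: assigning the whole (cost, cache) tuple
# into J_plus[i], a slot of dimension 1, raises the ValueError.
# J_plus[0] = forward_propagation_n(X, Y, theta_plus)

# Fix: unpack the tuple and keep only the cost (discard the cache).
J_plus[0], _ = forward_propagation_n(X, Y, theta_plus)
print(J_plus[0, 0])  # 2.0
```

The same pattern applies to the `theta_minus` / `J_minus[i]` line in the loop.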
