Gradient checking assignment, np.linalg.norm problem

Hi,

I am having a problem with numpy.linalg. When I run the assignment code on my own machine, everything is fine, but when I run it in the course notebook I get the error below. I am using numpy version 1.19.5.


```
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
      4 gradients = backward_propagation_n(X, Y, cache)
      5 difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
----> 6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
      7
      8 gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)

AssertionError: You are not using np.linalg.norm for numerator or denominator
```

Hi, @don.m.

Were you able to fix it?

gradient_check_n should return difference as a scalar, not an array — that happens automatically when you use np.linalg.norm for both the numerator and the denominator :slight_smile:
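
For reference, here is a minimal sketch of how that difference is typically computed with np.linalg.norm. The helper name relative_difference is just illustrative; in the assignment this computation lives inside gradient_check_n itself:

```python
import numpy as np

def relative_difference(grad, gradapprox):
    # Illustrative helper, not the assignment's exact code.
    # np.linalg.norm collapses the vector into a single float,
    # so the returned difference is a scalar, not an ndarray.
    numerator = np.linalg.norm(grad - gradapprox)                    # ||grad - gradapprox||_2
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)  # ||grad||_2 + ||gradapprox||_2
    return numerator / denominator
```

A common way to end up with an ndarray instead is building the norms by hand, e.g. np.sqrt(np.sum((grad - gradapprox)**2, keepdims=True)), which leaves a 1-element array and trips the assert above.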