Gradient_check_n : Wrong Value


Hi! I am currently stuck on gradient checking.
I have already checked that:

  • theta_plus and theta_minus are copies of parameter_values with +epsilon and -epsilon added, respectively
  • J_plus[i], _ is set to forward_propagation_n of X, Y and theta_plus (theta_minus for J_minus)
  • gradapprox[i] is computed inside the for loop from J_plus[i] and J_minus[i]
  • I used np.linalg.norm for the difference.

Do you have any insight? Thanks!
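For reference, the checks above can be sketched end-to-end. This is not the course's actual code: `forward_propagation_n` and `backward_propagation_n` are stood in for by a toy cost J(theta) = sum(theta²) and its analytic gradient so the sketch is self-contained and runnable, but the loop structure (theta_plus/theta_minus copies, centered difference inside the loop, `np.linalg.norm` for the relative difference) mirrors the steps listed:

```python
import numpy as np

# Hypothetical stand-ins for the assignment's helpers: a toy cost
# J(theta) = sum(theta^2) and its analytic gradient 2*theta.
def forward_propagation_n(theta):
    return np.sum(theta ** 2)

def backward_propagation_n(theta):
    return 2 * theta

def gradient_check_n(theta, epsilon=1e-7):
    grad = backward_propagation_n(theta)
    num_params = theta.shape[0]
    gradapprox = np.zeros(num_params)
    for i in range(num_params):
        # theta_plus / theta_minus: copies with +/- epsilon on component i only
        theta_plus = np.copy(theta)
        theta_plus[i] += epsilon
        theta_minus = np.copy(theta)
        theta_minus[i] -= epsilon
        J_plus = forward_propagation_n(theta_plus)
        J_minus = forward_propagation_n(theta_minus)
        # centered difference, computed inside the for loop
        gradapprox[i] = (J_plus - J_minus) / (2 * epsilon)
    # relative difference: ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

theta = np.array([1.0, -2.0, 3.0])
difference = gradient_check_n(theta)
print(difference)  # should be tiny (well below 1e-7) for a correct gradient
```

If all four bullet points hold and the difference is still large, the remaining suspect is usually the final norm formula itself, as it turned out below.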

OK, I solved it. I just misunderstood the “2” attached to each norm in the formula: I raised each norm to the power of 2 instead of reading the 2 as the L2 norm.
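For anyone hitting the same confusion, a quick illustration of the two readings (the vector here is just an example):

```python
import numpy as np

v = np.array([3.0, 4.0])

# The "2" subscript in the formula denotes the L2 (Euclidean) norm,
# which np.linalg.norm computes by default: sqrt(3^2 + 4^2) = 5.0
print(np.linalg.norm(v))       # 5.0

# The mistaken reading treats the 2 as an exponent, i.e. the norm squared,
# which inflates both numerator and denominator of the difference check:
print(np.linalg.norm(v) ** 2)  # 25.0
```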

Hi @Quentin,

Just to clarify, the 2 here refers to the L2 norm. Glad you were able to debug it!