# Gradient_check_n outputs difference = 1.0 and I don't know why

Hi AI community,
I am working on the last programming assignment, Gradient Checking, and I have been stuck for hours on the last Jupyter cell. I have implemented everything as described but cannot get the right result. Any ideas? This is what I am getting:

Thank you so much for your help!

Mistakes:

1. In backward_propagation_n, dW2 and db1 are incorrect.
2. In gradient_check_n, theta_minus[i] is assigned an incorrect value. (Note the word *minus*.)
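For the first mistake, here is a hedged sketch of what the fix in backward_propagation_n usually looks like. The variable names (dZ2, A1, dZ1, m) follow the assignment's conventions, and the "buggy" lines shown in the comments are the typical seeded errors in this exercise; your exact code may differ:

```python
import numpy as np

def corrected_grads(dZ2, A1, dZ1, m):
    """Illustrative corrections for dW2 and db1 (assignment-style names)."""
    # Typical seeded bugs look like:
    #   dW2 = 1. / m * np.dot(dZ2, A1.T) * 2                  # spurious "* 2"
    #   db1 = 4. / m * np.sum(dZ1, axis=1, keepdims=True)     # spurious "4."
    # Corrected versions (standard backprop formulas):
    dW2 = 1. / m * np.dot(dZ2, A1.T)
    db1 = 1. / m * np.sum(dZ1, axis=1, keepdims=True)
    return dW2, db1
```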

Here's some text from the markdown you'll find useful:

It seems that there were errors in the backward_propagation_n code! Good thing you've implemented the gradient check. Go back to backward_propagation_n and try to find/correct the errors (Hint: check dW2 and db1). Rerun the gradient check when you think you've fixed it. Remember, you'll need to re-execute the cell defining backward_propagation_n() if you modify the code.

Hi, and thank you for your feedback.

Can you please show me what the correct assignment for theta_minus[i] should look like? I don't see any problem there.

Sure. See this markdown text:

• To compute J_plus[i]:
1. Set \theta^{+} to np.copy(parameters_values)
2. Set \theta^{+}_i to \theta^{+}_i + \varepsilon
3. Calculate J^{+}_i using forward_propagation_n(x, y, vector_to_dictionary(\theta^{+}))
• To compute J_minus[i]: do the same thing with \theta^{-}
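The steps above can be sketched as a loop over the parameter vector. This is not the assignment's exact code: forward_propagation_n takes the training data and a parameter dictionary there, so a toy stand-in cost J(theta) = sum(theta_i^2) is used here to keep the sketch self-contained:

```python
import numpy as np

def forward_cost(theta):
    # Stand-in for forward_propagation_n: a toy cost J(theta) = sum(theta_i^2).
    return np.sum(theta ** 2)

def numerical_gradient(parameters_values, epsilon=1e-7):
    num_parameters = parameters_values.shape[0]
    grad_approx = np.zeros(num_parameters)
    for i in range(num_parameters):
        # Steps 1-2: copy theta, then nudge component i by +epsilon
        theta_plus = np.copy(parameters_values)
        theta_plus[i] = theta_plus[i] + epsilon
        J_plus = forward_cost(theta_plus)
        # Same for theta_minus -- note the MINUS sign before epsilon
        theta_minus = np.copy(parameters_values)
        theta_minus[i] = theta_minus[i] - epsilon
        J_minus = forward_cost(theta_minus)
        # Centered difference approximation of dJ/dtheta_i
        grad_approx[i] = (J_plus - J_minus) / (2 * epsilon)
    return grad_approx
```

If the minus sign is replaced by a plus, J_plus and J_minus coincide and the approximated gradient collapses to zero, which is what drives the reported difference toward 1.0.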

The last point asks the learner to follow the same steps for theta_minus and flip the sign of epsilon accordingly.

Ohh, I see it now! I did not put the minus before epsilon. Thank you so much for your help, it is correct now.