Week 1 Ex4: gradient_check_n

For some reason my difference is 0.999999 instead of the expected 0.28. I read that a common cause is adding epsilon in theta_minus instead of subtracting it, but that is not the case in my code. Please help.
My theta_plus/theta_minus is being calculated as:

```python
theta_plus[i] = theta_plus[i] + epsilon
theta_minus[i] = theta_minus[i] - epsilon
```
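For reference, here is a minimal, self-contained sketch of the two-sided perturbation step, with a toy cost J standing in for the assignment's forward propagation helper (the theta values and epsilon below are made up). The detail that is easy to miss is copying theta first, so each evaluation perturbs only component i:

```python
import numpy as np

def J(theta):
    # Toy stand-in for the cost; the assignment runs forward prop here.
    return float(np.sum(theta ** 2))

theta = np.array([1.0, 2.0, 3.0])
epsilon = 1e-7
i = 0

theta_plus = np.copy(theta)              # copy so only component i changes
theta_plus[i] = theta_plus[i] + epsilon
theta_minus = np.copy(theta)
theta_minus[i] = theta_minus[i] - epsilon

J_plus = J(theta_plus)                   # cost at theta + epsilon * e_i
J_minus = J(theta_minus)                 # cost at theta - epsilon * e_i
```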

I realized that I was not dividing (J_plus[i] - J_minus[i]) by 2 * epsilon when computing gradapprox. After doing that my error is now much lower, but it is still not the expected value of 0.28.
Below is my new value:

There is a mistake in the backward propagation! difference = 2.4577112074727173e-07
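(For context: the centered difference approximates the derivative as gradapprox[i] = (J_plus[i] - J_minus[i]) / (2 * epsilon). Without that division, gradapprox is on the order of epsilon, so it is negligible next to grad and the relative difference comes out at essentially 1, which would explain the original 0.999999. A tiny illustration with made-up numbers:)

```python
epsilon = 1e-7
J_plus_i, J_minus_i = 9.0000004, 8.9999996   # made-up cost values

# Centered (two-sided) difference approximation of dJ/dtheta_i:
gradapprox_i = (J_plus_i - J_minus_i) / (2.0 * epsilon)
print(gradapprox_i)   # ~4.0; without the division it would be ~8e-7
```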

Well, there are two expected values: one if you’ve fixed the intentional bugs in back prop and one if you haven’t. Your value is a little bigger than 2x the correct value if you have fixed the intentional back prop bugs.

Dunno if that’s enough of a clue to help you find the bug, though.
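For anyone following along: the difference being printed is, assuming the standard formula from the notebook, the relative error between the backprop gradient and the numerical approximation. A quick self-contained sketch with made-up vectors:

```python
import numpy as np

def relative_difference(grad, gradapprox):
    # ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

grad = np.array([1.0, 2.0, 3.0])        # made-up analytic gradient
gradapprox = grad + 1e-7                # numerical estimate, close to it
print(relative_difference(grad, gradapprox))   # ~2.3e-8
```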

Hi @paulinpaloalto. I haven't fixed the bugs yet, though.