DLS W2 Gradient_Checking

Hi,

I get this error:

“There is a mistake in the backward propagation! difference = 0.6623181774407311”.

I have triple-checked my code and I just can't figure out why.

{moderator edit - solution code removed}

On another note, does anyone have a link to a worked example of gradient checking with real numbers, so that it becomes less abstract? :slight_smile:

Please compare your code to the formula for gradapprox. In that line, you should not be using the $\theta$ values, but the $J$ values, right? That is, J_plus and J_minus. This also looks like the n-dimensional case. Did you pass the tests for the 1D case? That part of the code should look similar in both cases, although how you get J_plus and J_minus is a bit more complicated in the n-dimensional case.
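To make the "real numbers" request concrete: the two-sided formula is $\text{gradapprox} = \frac{J(\theta + \varepsilon) - J(\theta - \varepsilon)}{2\varepsilon}$. Here is a minimal 1D sketch, not the assignment code; the toy cost $J(\theta) = \theta x$ and the `forward`/`backward` functions are hypothetical stand-ins chosen so you can follow the arithmetic by hand:

```python
# Toy 1D gradient check with real numbers -- a sketch, not the assignment code.
# Assume a hypothetical cost J(theta) = theta * x, whose exact derivative is x.

def forward(x, theta):           # J(theta)
    return theta * x

def backward(x, theta):          # analytic dJ/dtheta
    return x

x, theta, epsilon = 2.0, 4.0, 1e-7

# The two-sided estimate uses the COST values J_plus and J_minus,
# not the theta values themselves:
J_plus  = forward(x, theta + epsilon)            # J(4.0000001) = 8.0000002
J_minus = forward(x, theta - epsilon)            # J(3.9999999) = 7.9999998
gradapprox = (J_plus - J_minus) / (2 * epsilon)  # = 4e-7 / 2e-7 = 2.0

grad = backward(x, theta)                        # exact gradient = 2.0

# Relative difference (the notebook uses np.linalg.norm for the vector case):
difference = abs(grad - gradapprox) / (abs(grad) + abs(gradapprox))
print(gradapprox, grad, difference)              # ~2.0, 2.0, ~1e-10
```

A correct implementation gives a difference many orders of magnitude below the $10^{-7}$ threshold; a value like 0.66 usually means the formula is combining the wrong quantities.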

If this is not making sense to you, my suggestion would be to watch the lectures again and read carefully through the explanations in the notebook.

Also notice that you filed this under MLS Course 2, but the title refers to DLS. Was that intentional? If not, you can move the thread by using the little "edit pencil" on the title.

Thanks, got it!

It totally makes sense conceptually, but actually seeing real numbers would make a huge difference, at least for me.