Gradient_check_n outputs difference = 1.0 and I don't know why

Hi AI community :slight_smile:
I am working on the last programming assignment, Gradient_Checking, and I have been stuck for hours on the last Jupyter cell. I have implemented everything as described but cannot get the right result. Any ideas? :slight_smile: I am getting this:

Thank you so much for your help!

Please click my name and message your notebook as an attachment.

Mistakes:

  1. In backward_propagation_n, dW2 and db1 are incorrect.
  2. In gradient_check_n, theta_minus[i] is assigned an incorrect value. (Note the word minus.)

Here’s some text from the markdown you’ll find useful:

It seems that there were errors in the backward_propagation_n code! Good thing you've implemented the gradient check. Go back to backward_propagation_n and try to find/correct the errors (Hint: check dW2 and db1). Rerun the gradient check when you think you've fixed it. Remember, you'll need to re-execute the cell defining backward_propagation_n() if you modify the code.
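To make that hint concrete without reproducing the notebook's code: in the standard cross-entropy setup this assignment uses, those two gradients normally take the form sketched below. The names (dZ2, A1, dZ1, m) follow the course's convention, and the planted bugs are typically just stray constant factors on these exact lines, so that is what to look for:

```python
# Sketch only, assuming the course's usual variable names -- not the
# notebook's exact code. The classic bugs here are stray constants.
dW2 = 1. / m * np.dot(dZ2, A1.T)                    # e.g. an extra "* 2" would break this
db1 = 1. / m * np.sum(dZ1, axis=1, keepdims=True)   # e.g. "4. / m" instead of "1. / m"
```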

Hi and thank you for your feedback :slight_smile:

Can you please show me what the correct assignment for theta_minus[i] should look like? I don't see any problem there.

Sure. See this markdown text:

  • To compute J_plus[i]:
    1. Set $\theta^{+}$ to np.copy(parameters_values)
    2. Set $\theta^{+}_i$ to $\theta^{+}_i + \varepsilon$
    3. Calculate $J^{+}_i$ using forward_propagation_n(x, y, vector_to_dictionary($\theta^{+}$)).
  • To compute J_minus[i]: do the same thing with $\theta^{-}$

The last point asks the learner to follow a similar line of thought for theta_minus and fix the sign of epsilon accordingly.
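For anyone else who lands here, a minimal self-contained sketch of the centered-difference loop those bullet points describe (generic names, not the notebook's exact code; J stands for any callable returning the scalar cost):

```python
import numpy as np

def numerical_gradient(J, theta, epsilon=1e-7):
    """Approximate dJ/dtheta component-wise with the centered difference
    (J(theta + eps) - J(theta - eps)) / (2 * eps)."""
    grad_approx = np.zeros_like(theta)
    for i in range(theta.shape[0]):
        theta_plus = np.copy(theta)                 # step 1: copy the parameter vector
        theta_plus[i] = theta_plus[i] + epsilon     # step 2: nudge component i up
        J_plus = J(theta_plus)                      # step 3: evaluate the cost

        theta_minus = np.copy(theta)
        theta_minus[i] = theta_minus[i] - epsilon   # the minus sign this thread is about
        J_minus = J(theta_minus)

        grad_approx[i] = (J_plus - J_minus) / (2 * epsilon)
    return grad_approx
```

The assignment then compares this approximation to the backprop gradient with the relative difference $\frac{\lVert grad - grad_{approx} \rVert_2}{\lVert grad \rVert_2 + \lVert grad_{approx} \rVert_2}$. Note that if theta_minus[i] is perturbed with +epsilon instead of -epsilon, theta_plus and theta_minus are identical, grad_approx comes out all zeros, and that ratio collapses to exactly 1.0, which matches the value in the thread title.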

Ohh, I see it now, I did not put the minus before epsilon :slight_smile: Thank you so much for your help. It is correct now :slight_smile: