Week 1: Gradient Checking last exercise error
Deep Learning Specialization / Improving Deep Neural Networks: Hyperparameter Tuning

I have passed all the tests, but even after submitting multiple times my grade is not getting updated. I also know there are mistakes planted in backward_propagation_n: dW2 includes a factor of 2, and db1 has a multiplier of 4, neither of which should be there. I can match the expected outputs by including or removing these two factors, respectively.
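For readers who haven't done this exercise: the notebook deliberately plants those two extra factors so that gradient checking can catch them. A minimal sketch of the pattern (not the course's full backward_propagation_n; the shapes and variable names here are hypothetical):

```python
import numpy as np

np.random.seed(0)
m = 5                       # number of examples (hypothetical)
dZ2 = np.random.randn(3, m)
A1 = np.random.randn(4, m)
dZ1 = np.random.randn(4, m)

# Buggy versions, with the deliberate extra factors described above:
dW2_buggy = 1. / m * np.dot(dZ2, A1.T) * 2
db1_buggy = 4. / m * np.sum(dZ1, axis=1, keepdims=True)

# Corrected versions, with the factors removed:
dW2 = 1. / m * np.dot(dZ2, A1.T)
db1 = 1. / m * np.sum(dZ1, axis=1, keepdims=True)

print(np.allclose(dW2_buggy, 2 * dW2))  # True: buggy dW2 is exactly 2x too big
print(np.allclose(db1_buggy, 4 * db1))  # True: buggy db1 is exactly 4x too big
```

Gradient checking flags exactly this kind of constant-factor error, because the analytic gradient then disagrees with the numerical estimate everywhere.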

Initially I got a grade of 60%, but my grade is not improving, even though the validation says that my notebook has passed all the tests. I am attaching all the relevant screenshots.

Please help

Regards

Dr. Baij Singh

If you pass the tests in the notebook but fail the grader, it typically means that you have written your code in a way that is not “general”: it works for one test but fails for a different test. It looks like your code for the “n-dimensional” case is correct, so maybe the place to look is the 1D code.
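To illustrate what “general” means here: the 1D check compares a two-sided numerical estimate against the analytic derivative, and it should work for any cost function and any point, not just the one the notebook tests. A minimal sketch (a hypothetical helper, not the notebook's exact signature):

```python
import numpy as np

def gradient_check_1d(J, dJ, x, epsilon=1e-7):
    """Compare the analytic derivative dJ(x) against a two-sided
    numerical estimate of J'(x). Returns the relative difference."""
    grad_approx = (J(x + epsilon) - J(x - epsilon)) / (2 * epsilon)
    grad = dJ(x)
    numerator = np.linalg.norm(grad - grad_approx)
    denominator = np.linalg.norm(grad) + np.linalg.norm(grad_approx)
    return numerator / denominator

# Works for an arbitrary J and x, not just the notebook's example:
diff = gradient_check_1d(lambda x: x ** 2, lambda x: 2 * x, 3.0)
print(diff < 1e-7)  # True for a correct derivative
```

Code that hardcodes the notebook's particular function or input values can pass the visible test and still fail the grader's different test case.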

If this hint is not enough to help, then it’s probably time to look at your notebook. We can’t do that on a public thread, but check your DMs for a message from me about how to proceed with that.

Dear Paul,

Thank you for your response.

Thanks and Regards

Dr. Baij Nath Singh

To close the loop on the public thread: the problem was the same one documented in this thread from back in June. If you cast the final difference value to float, the notebook passes but the grader fails in an entirely unhelpful way. An enhancement request has already been filed asking that a test be added to the notebook to catch this mistake transparently.
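The grader's internals aren't visible, but the reported failure mode is a type mismatch rather than a numerical one: np.linalg.norm returns a NumPy float64 scalar, and wrapping it in float() changes only the Python type, not the value. A small demonstration:

```python
import numpy as np

# The difference computed via np.linalg.norm is a NumPy scalar, not a
# plain Python float.
numerator = np.linalg.norm(np.array([1e-8]))
denominator = np.linalg.norm(np.array([2.0]))
difference = numerator / denominator      # leave it as a NumPy scalar
bad_difference = float(difference)        # the cast that trips the grader

print(type(difference).__name__)      # float64
print(type(bad_difference).__name__)  # float
print(difference == bad_difference)   # True: same value, different type
```

So the safe habit is to return the expression as computed and not cast it, since the value the notebook displays is identical either way.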


I moved the thread to the DLS course forum.