Getting 'inf' cost from Adam implementation, Week 2 Assignment

Hi,
I am done with most of the Week 2 assignment. Mini-Batch Gradient Descent and Mini-Batch Gradient Descent with Momentum are giving the expected results, but when I run Mini-Batch with Adam I get 50% accuracy and a total cost of inf. The cell that tests the correctness of the Adam implementation passes all the test cases and gives the expected results.

I have attached some screenshots.



Thank You

I get a different value even for the cost at epoch 0 when I run that training cell:

Cost after epoch 0: 0.702166
Cost after epoch 1000: 0.167845
Cost after epoch 2000: 0.141316
Cost after epoch 3000: 0.138788
Cost after epoch 4000: 0.136066

So something fundamental must be wrong. For starters, are you running this on the course website or in some other environment?

If you are running on the course website, it's worth a quick sanity check to make sure everything is in a consistent state before we investigate further. Try this sequence:

  1. Kernel → Restart and Clear Output
  2. Save
  3. Cell → Run All

Then check the output of that “training with Adam” cell and see if you are still getting the inf cost values.

I am using the course website.
I tried this, but I'm still getting the same issue.

Please check your DMs for a message from me about how to proceed here.

Just to close the loop here on the public thread: we had a private conversation involving the actual code, and there was a bug in the implementation that turns out not to be caught by the test cases in the notebook. I’m investigating how complicated it would be to enhance the test cases to catch that bug.
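For other learners who land on this thread: without revealing the assignment solution, here is a hedged sketch of a standard Adam update step, with comments pointing at the kinds of bugs that can slip past a one-shot unit test yet blow up (inf/NaN cost) during real training. The function name, dict layout, and hyperparameter defaults are my own illustration, not the assignment's actual interface.

```python
import numpy as np

def adam_update(params, grads, v, s, t,
                lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step over dicts of numpy arrays (illustrative sketch only)."""
    for key in params:
        # Exponentially weighted averages of the gradient and its square.
        v[key] = beta1 * v[key] + (1 - beta1) * grads[key]
        s[key] = beta2 * s[key] + (1 - beta2) * np.square(grads[key])

        # Bias correction. A classic mistake is hard-coding the exponent
        # (e.g. writing beta1 ** 2 instead of beta1 ** t): a test cell that
        # happens to call the function at t == 2 still passes, but the
        # update is wrong at every other step and training can diverge.
        v_hat = v[key] / (1 - beta1 ** t)
        s_hat = s[key] / (1 - beta2 ** t)

        # eps belongs in the denominator; dropping it, or placing it
        # somewhere else, risks division by ~0 and inf/NaN costs.
        params[key] = params[key] - lr * v_hat / (np.sqrt(s_hat) + eps)
    return params, v, s
```

The general lesson matches what happened here: a test that checks a single call with fixed inputs can agree with the expected output even when the formula is wrong for other values of `t`, which is exactly the kind of gap stronger test cases would close.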