All tests passed. The assignment did not pass

I have completed C2-W1’s last assignment and all tests passed (plus, I fixed the backprop errors), but the grader gives 60/100 and the assignment did not pass.

The grader feedback should tell you which functions were judged to be incorrect.

The feedback was not of much help: “[ValidateApp | INFO] Validating ‘/home/jovyan/work/submitted/courseraLearner/W1A3/Gradient_Checking.ipynb’
[ValidateApp | INFO] Executing notebook with kernel: python3
Tests failed on 1 cell(s)! These tests could be hidden. Please check your submission.”

Yikes! I agree that the grader feedback doesn’t give you anything to go on. It should at least mention which function failed. I’ll report that as a bug.

I checked the grader test cases and it looks like they require you to use np.linalg.norm to calculate the gradient error in the “n dimensional” case, but neither the notebook nor the grader errors tell you that. Are you sure that you used that function?
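For reference, here’s a minimal sketch of that computation, assuming `grad` and `gradapprox` are already flattened NumPy vectors (the variable names are mine, not necessarily the notebook’s):

```python
import numpy as np

def gradient_difference(grad, gradapprox):
    # Relative error between the analytic gradient and its numerical
    # approximation; the grader's tests expect np.linalg.norm here.
    numerator = np.linalg.norm(grad - gradapprox)                    # ||grad - gradapprox||
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)  # ||grad|| + ||gradapprox||
    return numerator / denominator
```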


Thank you, Paul. It did work with linalg.norm. Appreciated.

I am having the same issue. Even though I have used np.linalg.norm() and all test cases have passed, I am still getting a grade of 60/100 on submitting. Can someone help me?

Hi Paul, I have the same issue with the 60/100 grade and the same error message. I did use np.linalg.norm(); could it be something else? Thank you!

Here’s another parallel thread with more suggestions for things to check.


If I’m not mistaken, it is not the same problem. The original problem in that thread was that np.linalg.norm was missing; I do have it. The other mistakes in that thread produced errors in the tests, which I don’t have (or at least none are shown to me).


I am receiving the same error and am stuck at 80/100. I did use np.linalg.norm() and received the “all tests passed” feedback. Does this bug still persist, or is there an error in my code somewhere?

Hi, Elias.

The grader sometimes gives feedback that doesn’t tell you which functions are wrong. But if you get < 100 points, then there is a bug in your code. The problem is just that you don’t know which function to look at. Notice that in the first 1D version of gradient_check, there is no real test case in the notebook and they also don’t show you what the output should look like. Here’s my output for that test cell:

4.0000001 3.9999999 8.0000002 7.9999998 2.0000000011677344
2
1.167734353657579e-09
4.000000001167734
Your backward propagation works perfectly fine! difference = 2.919335883291695e-10

Note that I added some debugging prints to my code. The main thing to check is the 2.919e-10 number. If you got something different, then that’s where the bug is.
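For anyone who wants to reproduce those numbers, here’s a rough standalone sketch of the 1D check with similar debug prints. It’s my own scaffolding, not the notebook’s code; the values x = 2, theta = 4, epsilon = 1e-7 are inferred from the output above:

```python
import numpy as np

def gradient_check_1d(x=2, theta=4, epsilon=1e-7):
    # Nudge theta in both directions and evaluate J(theta) = theta * x.
    theta_plus = theta + epsilon
    theta_minus = theta - epsilon
    J_plus = theta_plus * x
    J_minus = theta_minus * x

    # Two-sided numerical approximation of dJ/dtheta, plus the analytic gradient.
    gradapprox = (J_plus - J_minus) / (2 * epsilon)
    grad = x

    # Relative error, using np.linalg.norm (it works on scalars too).
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    difference = numerator / denominator

    # Debug prints comparable to the ones shown above.
    print(theta_plus, theta_minus, J_plus, J_minus, gradapprox)
    print(grad)
    print(numerator)
    print(denominator)
    print("difference =", difference)

gradient_check_1d()
```

With those inputs the final difference should come out around 2.919e-10; anything substantially different points at where the bug is.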


That was it, thanks! I had a simple error in my denominator calculation, but the feedback said that it was working fine anyway.

It’s great news that you were able to find the solution under your own power. Onward! :nerd_face:

Thank you.
I forgot to use np.linalg.norm() in Exercise 3 (the 1D version of gradient check).
Since there was no real test case, it seemed to pass even though the result was nan (instead of roughly 0.0).