W1, A2, Part 3: optimize

While coding, all my test cases passed, but the grader failed that part. What should I do?


The test cases in the notebooks don’t catch all possible errors. For example, one way to get this scenario is to reference the global variable that is being passed to your function, rather than using the formal parameter name within the body of the function. Another is to make a hard-coded assumption about the size or shape of one of the inputs that happens to match the test data in the notebook, while the grader uses an object with different dimensions.
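To make the first failure mode concrete, here is a minimal sketch (hypothetical names, not the assignment's code):

```python
import numpy as np

# The notebook defines a global array and passes it into the function:
data = np.ones((3, 4))

def scale(x, factor):
    # BUG: uses the global `data` instead of the parameter `x`, so it
    # only "works" when the caller happens to pass `data` itself.
    return data * factor

print(scale(data, 2.0).shape)             # (3, 4) -- notebook test passes by luck
print(scale(np.ones((5, 2)), 2.0).shape)  # still (3, 4) -- grader's input is ignored
```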


I have the same problem. Part 4, which depends on optimize, passes as well.


Hi, Cornel.

Can you please be a bit more specific? What is the error message that you are seeing? Is it only from the grader? Do all the tests in the notebook pass?


Same problem (getting 75/100); describing it in detail.

The grader output:

Code Cell UNQ_C1: Function 'clip' is correct.
Code Cell UNQ_C2: Function 'sample' is correct.
Code Cell UNQ_C3: Unexpected error (KeyError('da_next')) occurred during function check. We expected function optimize to return Test 3 failed. Please check that this function is defined properly.
Code Cell UNQ_C4: Function 'model' is correct.
If you see many functions being marked as incorrect, try to trace back your steps & identify if there is an incorrect function that is being used in other steps.
This dependency may be the cause of the errors.

I wouldn’t characterize this error message as particularly readable.

My optimize code (it’s really hard to get wrong):

{moderator edit - solution code removed}

It passes the tests (the output matches the expected values):

Loss = 126.50397572165389
gradients["dWaa"][1][2] = 0.19470931534713587
np.argmax(gradients["dWax"]) = 93
gradients["dWya"][1][2] = -0.007773876032002162
gradients["db"][4] = [-0.06809825]
gradients["dby"][1] = [0.01538192]
a_last[4] = [-1.]
All tests passed!

Apart from the functions supplied by you, optimize uses clip from Exercise 1.
My implementation:

{moderator edit - solution code removed}

Tests:

Gradients for mValue=10
gradients["dWaa"][1][2] = 10.0
gradients["dWax"][3][1] = -10.0
gradients["dWya"][1][2] = 0.2971381536101662
gradients["db"][4] = [10.]
gradients["dby"][1] = [8.45833407]
All tests passed!

Gradients for mValue=5
gradients["dWaa"][1][2] = 5.0
gradients["dWax"][3][1] = -5.0
gradients["dWya"][1][2] = 0.2971381536101662
gradients["db"][4] = [5.]
gradients["dby"][1] = [5.]
All tests passed!


Ok, this is an interesting one. The reason that the grader fails does have to do with your clip function. You could argue that the problem is not really a bug in your implementation, but the grader thinks it is. Here’s what happened:

You changed the template code in clip and used a more sophisticated implementation than the one they were suggesting. Your implementation preserves and clips all entries in the input gradients dictionary, even entries that are not in the explicit list of gradient names given in the template code. So if you do it their way and pass in a dictionary that has an extra entry for the key "da_next", their version of the code strips out that entry and the returned dictionary does not contain it, but your version preserves and clips that entry. Apparently the grader checks for gradients["da_next"] and expects it not to be found; if it is found, it considers your code incorrect.
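To illustrate the difference with generic dictionary handling (hypothetical values, not the course's clip code):

```python
import numpy as np

gradients = {"dWax": np.array([12.0, -7.0]),
             "da_next": np.array([42.0])}  # extra key the grader probes for

# Template-style: clip only an explicit list of names, so any extra
# entry such as "da_next" simply doesn't appear in the result.
named = ["dWax"]  # the real template names several gradients
clipped_named = {k: np.clip(gradients[k], -10, 10) for k in named}

# "Clip everything": iterates over all keys, so "da_next" survives
# (clipped) in the output -- which is what trips the grader check.
clipped_all = {k: np.clip(v, -10, 10) for k, v in gradients.items()}

print("da_next" in clipped_named)  # False
print("da_next" in clipped_all)    # True
```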

I know this sounds weird, but I have confirmed the behavior. Now the question is whether that grader test is legitimate or not. Even if you end up winning that argument, it’s going to take weeks at best to get this resolved and the grader changed. From a practical standpoint, I suggest you either rewrite your logic the simpler way, or confirm the behavior by deleting that extra key from the dictionary in your clip logic and seeing that it then passes the grader.


I added logic to conditionally delete that "extra" key in the clip logic, and with that change your version of the clip logic passes the grader for optimize.
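For reference, that conditional delete can be a one-liner (using the key name from the grader error):

```python
# Remove the extra entry if present; no-op if it's absent.
gradients.pop("da_next", None)
```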


Thank you very much for this thorough reply!
It’s interesting that clip itself passes the grader.
Anyway, I changed my clip implementation (I saw your next answer only later), and now the whole assignment passes the grader.


I’m glad to hear that you were able to get the full score! Yes, I will file a bug about this. If they are going to make this requirement, then it makes more sense to test for it in the clip function. But the larger question is why they care about this at all. I must be missing something. :nerd_face:


You saved me, thanks a lot! This weird behavior has not changed until now.
