Week 1, Programming assignment 2, graded ex. 3 issue

Hello.
I get the message: Code Cell UNQ_C3: Unexpected error (KeyError('da_next')) occurred during function check. We expected function optimize to return Test 3 failed. Please check that this function is defined properly.
A previous post said this means some global variable is defined incorrectly somewhere, but I do not have any da_next or global variables… I also re-ran my notebook from scratch and it passes all the built-in tests, yet the grader still fails my UNQ_C3 test. Can you give a hint on how to debug this? Thank you

The lack of a screen capture of your error is notable.

Here you go, TMosh. Thank you for your help.


Thanks. The reason I wanted the screen capture was to verify what the source of the error was (the grader or the notebook’s built-in tests).

Text copy-and-paste rarely contains all of the context a mentor needs.

Will reply more shortly.

See this link: https://community.deeplearning.ai/t/w1-a2-part3-optimize/242726/6?u=paulinpaloalto

Hi Paul,
Thanks for the pointer. I read your post from a while ago:

Ok, this is an interesting one. The reason that the grader fails does have to do with your clip function. You could argue that the problem is not really a bug in your implementation, but the grader thinks it is. Here’s what happened:

You changed the template code in clip and used a more sophisticated implementation than what they were suggesting. The way your implementation works is that it preserves and clips all entries in the input gradients dictionary, even if they aren’t in the explicit named list of gradients that they gave you in the template code. So if you do it their way and pass in a dictionary that has an extra entry for the key “da_next”, then their version of the code would strip out that entry and the returned dictionary would not contain it. But your version of the code would preserve and clip that entry. Apparently the grader is checking for gradients["da_next"] and expects that not to be found. If it is found, it considers your code incorrect.

I know this sounds weird, but I have confirmed the behavior. Now the question is whether that grader test is legitimate or not. Even if you end up winning that argument, it’s going to take weeks at best to get this resolved and get the grader changed. From a practical standpoint, I suggest you either rewrite your logic to the simpler way or you could even confirm the behavior by deleting that extra key from the dictionary in your clip logic and see that it then passes the grader.

and frankly I do not understand what you mean. For the clip function I tried both:

for gradient_key in gradients:
    gradients[gradient_key] = np.clip(gradients[gradient_key], -maxValue, maxValue, out=gradients[gradient_key])

and

for gradient_key in gradients:
    gradients[gradient_key] = np.clip(gradients[gradient_key], -maxValue, maxValue, out=None)

and both pass the notebook's built-in tests immediately, but fail the grader in the same way.

Please read my previous post again with a bit more discernment. The key point is that the grader for optimize expects the key da_next not to be present in the dictionary after the call to clip. With your implementation of clip, if da_next is present in the dictionary that is input to clip, will it also be present in the output? The grader expects that not to be true, and if you had used the template code they gave you, it would not be true and you would pass the grader. But you clearly rewrote the template code that they gave you. So you have two choices:

  1. You can revert to the way the template code worked.
  2. You can add the logic to delete the key da_next from the output dictionary if it is found (a minimal sketch follows below).
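For option 2, here is a minimal illustration with a toy dictionary (not the assignment's actual data); dict.pop with a default removes the key without raising a KeyError when it is absent:

gradients = {"dWaa": 1.0, "da_next": 2.0}
gradients.pop("da_next", None)  # removes the key if present; no error if absent
print(gradients)                # {'dWaa': 1.0}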

Also note that if you stick with your more sophisticated version, the correct value for the out argument is out = gradient.
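To see what out does, here is a quick standalone check with a toy array (not assignment code): np.clip returns a clipped copy when out is None, and modifies the array in place when out is the array itself.

import numpy as np

g = np.array([10.0, -10.0, 2.0])
np.clip(g, -5, 5, out=None)   # returns a clipped copy; g itself is unchanged
print(g)                      # [ 10. -10.   2.]
np.clip(g, -5, 5, out=g)      # clips g in place
print(g)                      # [ 5. -5.  2.]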

Sorry, it’s been a while since I read the template code for clip. Here it is:

# UNQ_C1 (UNIQUE CELL IDENTIFIER, DO NOT EDIT)
### GRADED FUNCTION: clip

def clip(gradients, maxValue):
    '''
    Clips the gradients' values between minimum and maximum.
    
    Arguments:
    gradients -- a dictionary containing the gradients "dWaa", "dWax", "dWya", "db", "dby"
    maxValue -- everything above this number is set to this number, and everything less than -maxValue is set to -maxValue
    
    Returns: 
    gradients -- a dictionary with the clipped gradients.
    '''
    gradients = copy.deepcopy(gradients)
    
    dWaa, dWax, dWya, db, dby = gradients['dWaa'], gradients['dWax'], gradients['dWya'], gradients['db'], gradients['dby']
   
    ### START CODE HERE ###
    # Clip to mitigate exploding gradients, loop over [dWax, dWaa, dWya, db, dby]. (≈2 lines)
    for gradient in None:
        np.clip(None, None, None, out = None)
    ### END CODE HERE ###
    
    gradients = {"dWaa": dWaa, "dWax": dWax, "dWya": dWya, "db": db, "dby": dby}
    
    return gradients

So even if you do your version, where you use gradients.keys() as the range of the loop instead of the explicit list of the gradients you care about as suggested in the comment, you still would have been OK if you had left in place that last assignment statement that creates a new version of gradients. But you must have deleted that as well.
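To make the contrast concrete, here is a small self-contained sketch (toy one-element arrays, not the actual assignment data) of the two behaviors. The first function mirrors the template's structure; the second loops over every key, so the extra entry survives:

import numpy as np

# Toy gradients dictionary with an extra "da_next" entry,
# mimicking what optimize passes to clip.
grads = {"dWaa": np.array([7.0]), "dWax": np.array([-9.0]),
         "dWya": np.array([3.0]), "db": np.array([0.5]),
         "dby": np.array([-0.2]), "da_next": np.array([11.0])}

def clip_template_style(gradients, maxValue):
    # Mirrors the template: clip the five named arrays,
    # then rebuild the dictionary from only those five names.
    dWaa, dWax, dWya, db, dby = (gradients['dWaa'].copy(), gradients['dWax'].copy(),
                                 gradients['dWya'].copy(), gradients['db'].copy(),
                                 gradients['dby'].copy())
    for gradient in [dWax, dWaa, dWya, db, dby]:
        np.clip(gradient, -maxValue, maxValue, out=gradient)
    return {"dWaa": dWaa, "dWax": dWax, "dWya": dWya, "db": db, "dby": dby}

def clip_all_keys(gradients, maxValue):
    # Clips every entry, so "da_next" survives into the output.
    return {k: np.clip(v, -maxValue, maxValue) for k, v in gradients.items()}

print(sorted(clip_template_style(grads, 5)))  # ['dWaa', 'dWax', 'dWya', 'db', 'dby']
print(sorted(clip_all_keys(grads, 5)))        # includes 'da_next'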

I don’t disagree that this is all a bit of a distraction and they shouldn’t have written the grader code to care about that, since it doesn’t really matter. I think I filed a bug about this a while back and it either got rejected or shelved into the “maybe someday” category. But to be fair to the course staff, you did go out of your way to step on this landmine and it’s only happened a few times historically, so maybe fixing it is not worth it. :nerd_face:

Well, I just played dumb and added these lines:

if 'da_next' in gradients:
    del gradients['da_next']

and it worked. Thanks, Paul!

Is that a good solution?

I don't think it is a problem. After all, it seems the grader has a bug, and my "solution" lets me work around it rather than wait for it to be fixed by OpenAI staff.

But the higher-level point is that you could have left the template code as it was, and then your mystery code would not have been required. So there was a simpler solution, which is how I interpret Tom's question.

Yes, it's arguably a bug in the grader, but you only hit it because you chose to override the instructions they gave you.

The grader doesn’t have a bug. It just didn’t like your code.

OpenAI staff do not work here.

I tried this, which I thought followed the original template code, but I still got the da_next error; then I just "gave up" and deleted the key manually, as I said above…

for gradient in gradients.values():
    np.clip(gradient, -maxValue, maxValue, out=gradient)

The question is, how did da_next appear in your output such that you needed to delete it?

I still do not get Paul's point: "but you only hit it because you chose to override the instructions they gave you."

When I do just this:

for gradient in gradients.values():
    np.clip(gradient, -maxValue, maxValue, out=gradient)

as I understand it, I am still following the instructions. But what I am doing is not enough, so I do not know how to fix the issue "in the correct way".

Did you get a fresh copy of the assignment and look at the original template code? You deleted the crucial last line before the return statement, the one that rebuilds gradients from the five named variables. That is the sense in which you did not follow their instructions. How you write the loop essentially doesn't matter.
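To put the pieces together, here is a hedged sketch of what that looks like: it keeps your values() loop but restores the template's final rebuild line, which is what drops extra keys such as da_next. This is an illustration under the assumption that the rest of the template (the deepcopy and the unpacking) is left intact, not the official solution:

import copy
import numpy as np

def clip(gradients, maxValue):
    gradients = copy.deepcopy(gradients)
    dWaa, dWax, dWya, db, dby = (gradients['dWaa'], gradients['dWax'],
                                 gradients['dWya'], gradients['db'],
                                 gradients['dby'])
    # Clipping in place through the dictionary also clips the five
    # named variables, since they alias the same arrays.
    for gradient in gradients.values():
        np.clip(gradient, -maxValue, maxValue, out=gradient)
    # The template's final line: rebuilding from the named variables
    # is what discards any extra entry such as "da_next".
    gradients = {"dWaa": dWaa, "dWax": dWax, "dWya": dWya, "db": db, "dby": dby}
    return gradients

# Toy check (illustrative values only):
g = {"dWaa": np.array([9.0]), "dWax": np.array([-9.0]), "dWya": np.array([1.0]),
     "db": np.array([0.3]), "dby": np.array([-0.1]), "da_next": np.array([42.0])}
print(sorted(clip(g, 5)))  # no 'da_next' in the output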