Course 5 Week 2 Grading Issue / AssertionError

Dear colleagues,

The complete_analogy_test(target) test function throws an AssertionError on this line:

assert(target('a', 'c', 'a', word_to_vec_map) == 'c')

Since the test fails, the cell cannot run, and the notebook is graded -50 points. Does anybody know how to solve the problem?

If I modify the assertion to assert(target('a', 'c', 'a', word_to_vec_map) != 'c') to work around the failure, the system doesn't accept it.

Hi @bullor

The assertion error is raised because your code is not returning the expected value. If there is a problem in the code, then debugging is needed. Start by checking that the correct parameters are being passed to the cosine_similarity() function; then, assuming your word embeddings are set up correctly, check that you have done the max cosine update in the right way.
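
To illustrate the max cosine update, here is a minimal sketch of the search loop (variable names such as e_a, e_b, e_c, and best_word follow the notebook; this is a pattern to compare against, not the full solution, and it omits the skip condition discussed further down in this thread):

    # a sketch of the max-cosine update, assuming the notebook's
    # cosine_similarity(u, v) helper and the word_to_vec_map dict
    max_cosine_sim = -100   # start below any reachable similarity
    best_word = None
    for w in word_to_vec_map.keys():
        # e_b - e_a should point the same way as e_w - e_c
        cosine_sim = cosine_similarity(e_b - e_a, word_to_vec_map[w] - e_c)
        if cosine_sim > max_cosine_sim:
            max_cosine_sim = cosine_sim   # keep the running maximum
            best_word = w                 # and the word that achieved it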


Do not modify the asserts. The issue is in your code, not the test.

Hi Kic,

Thank you for your support. I checked my cosine_similarity function and verified it against other community replies.

cosine_sim = cosine_similarity(e_b - e_a, word_to_vec_map[w] - e_c)

I can't see what is wrong with the code. Would you please help me check?

Thanks.

Hi @bullor ,

The parameters passed to cosine_similarity() look fine. It could be something very simple that is causing the problem. Could you try:

  1. restart the kernel and clear all output
  2. run all code cells above
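
For reference, cosine_similarity() as defined earlier in the notebook computes the standard formula cos(theta) = (u . v) / (||u|| ||v||). A minimal NumPy sketch of that definition (an illustration, not necessarily the notebook's exact code):

    import numpy as np

    def cosine_similarity(u, v):
        # dot product of u and v, divided by the product of their L2 norms
        dot = np.dot(u, v)
        norm_u = np.sqrt(np.sum(u ** 2))
        norm_v = np.sqrt(np.sum(v ** 2))
        return dot / (norm_u * norm_v)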

Hi,

I am still stuck at the same place.

My notebook is still not working properly because of the tests, although the cosine similarity is correct. Can anyone help?
Thanks

Hi @bullor

I have checked the latest assignment, and it appears that your assignment is different, in particular in the instruction for coding the loop:

to avoid best_word being one of the input words, skip the input word_c

skip word_c from query

However, the comment instruction from your version of the assignment is:

to avoid best_word being one of the input words, pass on them.

So your implementation, which skips all of the input words, resulted in the assertion error for the last test.
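
In code, the difference between the two instructions comes down to the skip condition inside the loop (a sketch, using the notebook's word_a, word_b, word_c inputs):

    # your version's instruction: skip all three input words
    if w in [word_a, word_b, word_c]:
        continue

    # the instruction in the version I see: skip only word_c
    if w == word_c:
        continue

The failing test expects target('a', 'c', 'a', word_to_vec_map) to return 'c'. Here word_b is 'c', so skipping all input words rules out the expected answer, while skipping only word_c (which is 'a' here) leaves 'c' available as best_word.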

This section of the coding exercise has not been changed for at least a year since I last worked on it, so I don't know how you could have a different assignment.

Hi Kic,

Thanks for your message. If my instruction reads 'to avoid best_word being one of the input words, pass on them.', why does the error appear? The code should be correct based on this version.

As you can see, although I clicked through to the latest version and refreshed the lab, the incorrect statement is still there:

‘to avoid best_word being one of the input words, pass on them.’

Can you help me get the unified version (or your version) into my lab?

Dear Kic,

I fixed the problem by reverting to the last checkpoint and retyping the code line by line.
Now it is working.
Thank you for your support.

Hi @bullor

Great to hear the problem is now solved.

Hi Kic ,

In the last assignment of Course 5 I get a page error, as follows. My lab ID is wzzxwwkh.
Can you help me solve it by reinitializing the lab?

Hi Kic,

I was able to reinitialize it, and that problem is fixed. But now I have another problem in the last assignment of Course 5, exercise # UNQ_C6.

self.mha2(enc_output, enc_output, Q1, padding_mask, return_attention_scores=True)

The call above throws an error while being tested. The arguments to mha2 should be the K, Q, and V values. Since I don't know how to explicitly define K and V, I just put enc_output. I think that is the root cause of the problem.

The error I get is: AssertionError: Wrong values in attn_w_b2. Check the call to self.mha2

Could you please help me to fix that ?

Thank you.

Hi @bullor ,

Here is the instruction from the assignment on how to code Block 2:

BLOCK 2

    # calculate self-attention using the Q from the first block and K and V from the encoder output. 
    # Dropout will be applied during training
    # Return attention scores as attn_weights_block2 (~1 line) 

It looks like you have passed the parameters to self.mha2() in the wrong order. It should be (Q1, enc_output, enc_output, padding_mask, return_attention_scores=True).
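
For context, the Keras MultiHeadAttention layer is called as mha(query, value, key, ...), so the Block 2 call would look roughly like this (a sketch using the notebook's Q1, enc_output, padding_mask, and attn_weights_block2 names; mult_attn_out2 is an assumed name for the attention output):

    # query comes from block 1; key and value come from the encoder output
    mult_attn_out2, attn_weights_block2 = self.mha2(
        Q1,                            # query: output of the first decoder block
        enc_output,                    # value: encoder output
        enc_output,                    # key: encoder output
        attention_mask=padding_mask,   # mask out padding tokens
        return_attention_scores=True,  # also return attn_weights_block2
    )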

Thanks Kic,

That resolved my issue.