```
AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      1 # UNIT TEST
----> 2 DecoderLayer_test(DecoderLayer, create_look_ahead_mask)

~/work/W4A1/public_tests.py in DecoderLayer_test(target, create_look_ahead_mask)
    179
    180     assert np.allclose(attn_w_b1[0, 0, 1], [0.5271505, 0.47284946, 0.], atol=1e-2), "Wrong values in attn_w_b1. Check the call to self.mha1"
--> 181     assert np.allclose(attn_w_b2[0, 0, 1], [0.32048798, 0.390301, 0.28921106]), "Wrong values in attn_w_b2. Check the call to self.mha2"
    182     assert np.allclose(out[0, 0], [-0.22109576, -1.5455486, 0.852692, 0.9139523]), "Wrong values in out"
    183

AssertionError: Wrong values in attn_w_b2. Check the call to self.mha2
```
This is the error I am getting. I have checked the call to self.mha2 specifically, along with all other areas of the code, but I cannot see where I am going wrong. Could someone please suggest where I should look next to find the problem?
If this topic doesn’t help, please click my name and message your notebook as an attachment.
Hello Shreyas,
Based on the error log, your first block's call to self.mha1 seems to have a problem. Could you send that part of the code to my message inbox? Please click on my name, select message, and share your code from the graded cell just above the error shown in the image.
Regards
DP
That topic link is really worth reading once.
Regards
DP
Thank you for the response. I looked at the link, but I have not been able to resolve my issue, so, as advised, I have sent my code. Please let me know if you find an issue.
Thank you for your time.
Please look at the architecture of the decoder layer.
You have manually changed the comments, which explains the additional dropout layer in your implementation. Dropout needs to be applied only once, i.e. to the output of the feed-forward network. Compare the starter comments with yours, and see the sketch after the comparison below.
From starter code:
```python
# BLOCK 1
# calculate self-attention and return attention scores as attn_weights_block1.
# Dropout will be applied during training (~1 line).
```
Yours:
```python
# BLOCK 1
# calculate self-attention and return attention scores as attn_weights_block1 (~1 line)
# LINE OF CODE
# apply dropout layer on the attention output (~1 line)
# LINE OF CODE APPLYING DROPOUT
```
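For anyone who hits this later, below is a minimal sketch of how the three blocks of a decoder layer line up, written against the public tf.keras.layers.MultiHeadAttention API. The layer names (self.mha1, self.mha2, self.ffn, self.layernorm1 to self.layernorm3, self.dropout_ffn) follow the assignment's starter-code conventions, but this is only an illustration of where the single dropout call belongs, not the graded solution:

```python
import tensorflow as tf

class DecoderLayer(tf.keras.layers.Layer):
    """Sketch of one Transformer decoder layer (illustrative, not the graded solution)."""

    def __init__(self, embedding_dim=4, num_heads=2, fully_connected_dim=8, dropout_rate=0.1):
        super().__init__()
        # Attention dropout lives *inside* the MultiHeadAttention layers, which is
        # why the starter comment says "Dropout will be applied during training"
        # without asking for an extra Dropout line.
        self.mha1 = tf.keras.layers.MultiHeadAttention(num_heads, key_dim=embedding_dim, dropout=dropout_rate)
        self.mha2 = tf.keras.layers.MultiHeadAttention(num_heads, key_dim=embedding_dim, dropout=dropout_rate)
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(fully_connected_dim, activation="relu"),
            tf.keras.layers.Dense(embedding_dim),
        ])
        self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.layernorm3 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)  # the only explicit Dropout layer

    def call(self, x, enc_output, training, look_ahead_mask, padding_mask):
        # BLOCK 1: masked self-attention; no extra Dropout layer here.
        attn1, attn_weights_block1 = self.mha1(
            x, x, x, attention_mask=look_ahead_mask,
            return_attention_scores=True, training=training)
        out1 = self.layernorm1(attn1 + x)

        # BLOCK 2: cross-attention; queries come from BLOCK 1,
        # keys/values come from the encoder output.
        attn2, attn_weights_block2 = self.mha2(
            out1, enc_output, enc_output, attention_mask=padding_mask,
            return_attention_scores=True, training=training)
        out2 = self.layernorm2(attn2 + out1)

        # BLOCK 3: feed-forward network; dropout is applied exactly once,
        # to the FFN output, before the residual connection and layer norm.
        ffn_output = self.dropout_ffn(self.ffn(out2), training=training)
        out3 = self.layernorm3(ffn_output + out2)

        return out3, attn_weights_block1, attn_weights_block2
```

Note that an extra Dropout applied to the BLOCK 1 attention output perturbs out1, which is the query input to self.mha2, which is presumably why the failure only surfaces at the attn_w_b2 assert even though the bug is in BLOCK 1.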
Please follow these steps to refresh your workspace if required, changing code only at the places where it is required. See the section Important Note on Submission to the AutoGrader in the notebook as well.
Thank you for resolving my issue. After refreshing my workspace it all became clear.
Shreyas, it looks like you were able to resolve your issue with the code?
Yes, I completed the assignment. Thank you for your time.
I am facing the exact same issue, and refreshing the notebook did not really help. @Deepti_Prasad?
@Karthik_Nagraj
Please avoid posting your query on older threads; instead, create a new topic with a screenshot of the error you have encountered.
Regards
DP