romain
1
Hi!
I'm getting this error.
I wrote something like this:
attn1, attn_weights_block1 = self.mha1(x, x, x, look_ahead_mask, return_attention_scores=True)  # (batch_size, target_seq_len, d_model)
attn2, attn_weights_block2 = self.mha2(enc_output, enc_output, out1, padding_mask, return_attention_scores=True)  # (batch_size, target_seq_len, d_model)
Do you have any tips to fix this error?
Thanks !
TMosh
2
It would help if you specified which function you’re working on. I’m going to assume it is the “call” method of DecoderLayer().
Your arguments for mha2(…) are not correct.
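As a hint, note that tf.keras.layers.MultiHeadAttention is called with the query first, then the value, then the key. Here is a minimal sketch of the argument order, where q_input, kv_input, and mask are placeholder names rather than the assignment's variables:

attn_out, attn_weights = mha(
    query=q_input,        # the sequence you are attending from
    value=kv_input,       # the sequence you are attending to
    key=kv_input,         # usually the same tensor as value
    attention_mask=mask,
    return_attention_scores=True,
)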
romain
3
Yes, it’s the call function of the decoder block.
Thanks for your help !
Hi,
I am facing the same issue in the call function in UNQ_C6:
AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      1 # UNIT TEST
----> 2 DecoderLayer_test(DecoderLayer, create_look_ahead_mask)

~/work/W4A1/public_tests.py in DecoderLayer_test(target, create_look_ahead_mask)
    179
    180     assert np.allclose(attn_w_b1[0, 0, 1], [0.5271505, 0.47284946, 0.], atol=1e-2), "Wrong values in attn_w_b1. Check the call to self.mha1"
--> 181     assert np.allclose(attn_w_b2[0, 0, 1], [0.32048798, 0.390301, 0.28921106]), "Wrong values in attn_w_b2. Check the call to self.mha2"
    182     assert np.allclose(out[0, 0], [-0.22109576, -1.5455486, 0.852692, 0.9139523]), "Wrong values in out"
    183

AssertionError: Wrong values in attn_w_b2. Check the call to self.mha2
Here are the calls I'm making:
{mentor edit: code removed}
Can you please guide me on where I am going wrong?
Thank you.
I’m having the same issue. I have tried code similar to yours, but I also replaced
self.mha2(mult_attn_out1 with {mentor edit: code removed},
where Q1 = {mentor edit: code removed}.
Any suggestions appreciated!
Edit: Figured it out. I needed: {mentor edit: code removed}
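For anyone who finds this thread later: in the decoder's second attention block, the query should come from the decoder side (the output of the first block), while the key and value come from the encoder output. Below is a minimal, self-contained sketch of that pattern; all names and shapes are illustrative, and it is not the assignment's graded code:

import tensorflow as tf

mha = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=4)
decoder_hidden = tf.random.uniform((1, 5, 8))   # (batch, target_seq_len, d_model)
encoder_output = tf.random.uniform((1, 7, 8))   # (batch, input_seq_len, d_model)

# Query from the decoder; key and value from the encoder.
attn_out, attn_weights = mha(
    query=decoder_hidden,
    value=encoder_output,
    key=encoder_output,
    return_attention_scores=True,
)
print(attn_out.shape)      # (1, 5, 8)  -- matches the query
print(attn_weights.shape)  # (1, 2, 5, 7) -- (batch, heads, target_len, input_len)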