In Week 2 of the NLP Specialization (NLP with Attention Models), in the decoder layer, self.mha1 doesn't return 2 values.
mult_attn_out1, attn_weights_block1 = self.mha1(x, x, x, look_ahead_mask)
gives the error:
not enough values to unpack (expected 2, got 1)
Hello @TANISHA_KAUR,
Firstly, a gentle reminder that sharing graded code directly is not in line with our community guidelines. We always aim to maintain the integrity and fairness of our learning environment.
Regarding the issue you're facing in the NLP with Attention Models Week 2 assignment: when you invoke self.mha1, it's crucial to include the additional argument return_attention_scores, setting it to True. This modification should resolve the error you're encountering.
Your current line of code leads to the error "not enough values to unpack (expected 2, got 1)" because, by default, self.mha1 returns only a single tensor, not the expected two values. By updating the call to include return_attention_scores=True, you will receive both the multi-head attention output and the attention weights, which is what you're trying to unpack.