Stuck at UNQ_C6 with the AssertionError "Wrong values in attn_w_b2. Check the call to self.mha2". My call is:
mult_attn_out2, attn_weights_block2 = self.mha2(Q1, enc_output, enc_output, padding_mask, return_attention_scores=True)
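That argument order matches tf.keras.layers.MultiHeadAttention's positional signature (query, value, key, attention_mask), so I think the call form itself is fine. A small standalone check with made-up shapes, just to confirm what the layer returns:

import tensorflow as tf

batch, tgt_len, src_len, d_model = 2, 5, 7, 16
mha2 = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=d_model)

Q1 = tf.random.uniform((batch, tgt_len, d_model))         # output of decoder block 1
enc_output = tf.random.uniform((batch, src_len, d_model))  # encoder output
padding_mask = tf.ones((batch, tgt_len, src_len))          # 1 = attend, 0 = mask out

mult_attn_out2, attn_weights_block2 = mha2(
    Q1, enc_output, enc_output,
    attention_mask=padding_mask,
    return_attention_scores=True)

print(mult_attn_out2.shape)       # (2, 5, 16)
print(attn_weights_block2.shape)  # (2, 4, 5, 7)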
Most likely causes:
mult_attn_out1, attn_weights_block1 = self.mha1(x, x, x, look_ahead_mask, return_attention_scores=True)
mult_attn_out1 = self.dropout_ffn(mult_attn_out1, training=training)
Q1 = self.layernorm1(mult_attn_out1 + x)
I haven't modified any of the constructor code in DecoderLayer(), and I did not modify the create_look_ahead_mask() function, so most likely the value of Q1 is incorrect. Please review my code.
I solved the problem.
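For anyone who hits the same assertion later: what stands out in the block 1 code above is the dropout_ffn call on mult_attn_out1. In the usual three-block decoder layout, dropout_ffn is applied only to the feed-forward output in block 3, so an extra dropout in block 1 perturbs Q1 and, through it, attn_w_b2. Below is a minimal, self-contained sketch of that layout; it reuses the layer names from the assignment, but the class name, hyperparameters, and shapes are made up for illustration, and it is not the graded solution:

import tensorflow as tf

class DecoderLayerSketch(tf.keras.layers.Layer):
    # Illustrative decoder layer; mirrors the usual three-block structure.
    def __init__(self, d_model=16, num_heads=4, dff=32, rate=0.1):
        super().__init__()
        self.mha1 = tf.keras.layers.MultiHeadAttention(num_heads, key_dim=d_model)
        self.mha2 = tf.keras.layers.MultiHeadAttention(num_heads, key_dim=d_model)
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(dff, activation="relu"),
            tf.keras.layers.Dense(d_model)])
        self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.layernorm3 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.dropout_ffn = tf.keras.layers.Dropout(rate)

    def call(self, x, enc_output, training, look_ahead_mask, padding_mask):
        # Block 1: masked self-attention, residual, layer norm -- no dropout here.
        mult_attn_out1, attn_weights_block1 = self.mha1(
            x, x, x, attention_mask=look_ahead_mask, return_attention_scores=True)
        Q1 = self.layernorm1(mult_attn_out1 + x)

        # Block 2: cross-attention over the encoder output.
        mult_attn_out2, attn_weights_block2 = self.mha2(
            Q1, enc_output, enc_output, attention_mask=padding_mask,
            return_attention_scores=True)
        mult_attn_out2 = self.layernorm2(mult_attn_out2 + Q1)

        # Block 3: feed-forward network; dropout_ffn belongs here only.
        ffn_output = self.ffn(mult_attn_out2)
        ffn_output = self.dropout_ffn(ffn_output, training=training)
        out3 = self.layernorm3(ffn_output + mult_attn_out2)
        return out3, attn_weights_block1, attn_weights_block2

# Quick shape check with random inputs.
layer = DecoderLayerSketch()
x = tf.random.uniform((2, 5, 16))
enc = tf.random.uniform((2, 7, 16))
out3, w1, w2 = layer(x, enc, training=False, look_ahead_mask=None, padding_mask=None)
print(out3.shape, w1.shape, w2.shape)  # (2, 5, 16) (2, 4, 5, 5) (2, 4, 5, 7)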