C5_W4_A1_Transformer_Subclass_v1 # BLOCK 2

Inside class DecoderLayer(tf.keras.layers.Layer):


    # calculate self-attention using the Q from the first block
    # and K and V from the encoder output.
    # Return attention scores as attn_weights_block2 (~1 line)
    attn2, attn_weights_block2 = self.mha2(enc_output, enc_output, out1, padding_mask, return_attention_scores=True)

I get this error when I run the code. Any ideas? TIA

AssertionError: Wrong values in attn_w_b2. Check the call to self.mha2

You have the order of the arguments wrong. tf.keras.layers.MultiHeadAttention takes (query, value, key, attention_mask, ...). In block 2 the query comes from the first block's output (out1), while the key and value both come from the encoder output, so the call should be:

(out1, enc_output, enc_output, padding_mask, return_attention_scores=True)
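To make the order mistake harder to repeat, you can pass the arguments by keyword. Here is a minimal standalone sketch (not the assignment's notebook; the layer, shapes, and mask here are made up for illustration) showing the correct cross-attention call on a fresh tf.keras.layers.MultiHeadAttention:

```python
import tensorflow as tf

# Toy dimensions: batch=2, target length=4, source length=6, model dim=8.
mha2 = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=4)

out1 = tf.random.uniform((2, 4, 8))        # output of decoder block 1 -> the queries
enc_output = tf.random.uniform((2, 6, 8))  # encoder output -> the keys and values
padding_mask = tf.ones((2, 1, 6))          # 1 = attend; broadcasts over query positions

# Keyword arguments make the (query, value, key) order explicit.
attn2, attn_weights_block2 = mha2(
    query=out1,
    value=enc_output,
    key=enc_output,
    attention_mask=padding_mask,
    return_attention_scores=True,
)

print(attn2.shape)                # (2, 4, 8)
print(attn_weights_block2.shape)  # (2, 2, 4, 6): (batch, heads, target, source)
```

Note the attention-weight shape: the last axis is the source (encoder) length, which is why swapping out1 and enc_output produces weights of the wrong shape/values and trips the grader's check.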