Week 4 (Transformer Network), Exercise 4

Hello,

I am getting the error below when trying to do the step 'apply dropout layer to the self-attention output'. I am using self.dropout1 and passing it the self_attn_output from the previous step, along with the training parameter. I don't understand what I am doing wrong. Please help.

"Attempt to convert a value (<tensorflow.python.keras.layers.multi_head_attention.MultiHeadAttention object at 0x7feb0816e850>) with an unsupported type (<class 'tensorflow.python.keras.layers.multi_head_attention.MultiHeadAttention'>) to a Tensor."

Are you using "training = training"?
That error means a layer object, rather than a tensor, was passed where a tensor was expected, so the problem is likely in the code that computes self_attn_output.
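For anyone hitting the same error: it typically comes from passing the MultiHeadAttention layer object itself (self.mha) to the dropout layer instead of the tensor returned by *calling* it. Here is a minimal sketch of the step in question; the attribute names (mha, dropout1) follow the assignment's usual EncoderLayer convention, but your notebook's exact signature may differ.

```python
import tensorflow as tf

class EncoderLayerSketch(tf.keras.layers.Layer):
    """Minimal sketch of the self-attention + dropout step (not the full assignment)."""

    def __init__(self, embedding_dim=8, num_heads=2, dropout_rate=0.1):
        super().__init__()
        self.mha = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=embedding_dim)
        self.dropout1 = tf.keras.layers.Dropout(dropout_rate)

    def call(self, x, training=False, mask=None):
        # The layer must be CALLED so it returns a tensor. Writing
        # self.dropout1(self.mha, ...) passes the layer object itself and
        # raises the "unsupported type ... MultiHeadAttention" error.
        self_attn_output = self.mha(x, x, x, attention_mask=mask)
        # Apply dropout to the self-attention output, forwarding `training`.
        return self.dropout1(self_attn_output, training=training)

layer = EncoderLayerSketch()
out = layer(tf.random.uniform((1, 4, 8)), training=True)
print(out.shape)  # (1, 4, 8)
```

The output keeps the input's (batch, sequence, embedding) shape, since MultiHeadAttention projects back to the query's last dimension.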

Hello Tom,

I just resolved this. Yes, the problem was in the step before it, the 'mha' call. Thank you.