Natural Language Processing with Attention Models C4W1_Assignment Exercise 2

This concerns the call of the CrossAttention layer (Exercise 2). The comments say:

'# Call the MH attention by passing in the query and value
# For this case the query should be the translation and the value the encoded sentence to translate
# Hint: Check the call arguments of MultiHeadAttention in the docs'

You are not supposed to add target and context together; you pass them as call arguments to the MultiHeadAttention layer, as the comments describe.

Have a look at the call arguments:
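As an illustration of what that call looks like, here is a minimal sketch with made-up shapes (the tensor sizes are arbitrary, not the assignment's; only the query/value wiring matters):

```python
import tensorflow as tf

# Hypothetical shapes for illustration: batch of 1, target length 4,
# context length 6, embedding dimension 16.
target = tf.random.uniform((1, 4, 16))   # the translation so far -> query
context = tf.random.uniform((1, 6, 16))  # the encoded sentence -> value

mha = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=16)

# Pass the tensors as call arguments: query=target, value=context.
# When `key` is omitted, Keras reuses `value` as the key.
attn_output = mha(query=target, value=context)

print(attn_output.shape)  # (1, 4, 16) -- same shape as the query
```

Note that the output has the query's sequence length, which is why the query must be the translation: the layer produces one attended vector per target position.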