```python
# UNQ_C3 (UNIQUE CELL IDENTIFIER, DO NOT EDIT)
# GRADED FUNCTION scaled_dot_product_attention
def scaled_dot_product_attention(q, k, v, mask):
    # mentor edit: code removed
    return output, attention_weights
```
The implementation looks correct to me, but an assertion error is still thrown stating "Wrong masked weights". What's the issue?
TMosh
December 16, 2024, 8:08am
Please do not share your code on the forum. That is not allowed by the Code of Conduct.
If a mentor needs to see your code, we will contact you with instructions.
I have edited your post to remove the code.
Please see this text from the markdown:
> Reminder: The boolean `mask` parameter can be passed in as `None`, or as either a padding or look-ahead mask. Multiply `(1. - mask)` by `-1e9` before applying the softmax.
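For context, here is a minimal sketch of how that masking step is commonly wired into scaled dot-product attention in TensorFlow. This follows the pattern in the public TensorFlow Transformer tutorial rather than the graded solution, and the variable names are assumptions:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """Sketch of scaled dot-product attention with optional masking.

    q, k, v: tensors with shape (..., seq_len, depth).
    mask: broadcastable to (..., seq_len_q, seq_len_k); by the
    assignment's convention, 1 marks positions that may be attended
    to and 0 marks blocked positions.
    """
    # Q K^T, scaled by sqrt(d_k) to keep the logits well-conditioned.
    matmul_qk = tf.matmul(q, k, transpose_b=True)
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_attention_logits = matmul_qk / tf.math.sqrt(dk)

    # The masking step from the markdown: (1. - mask) * -1e9 pushes
    # blocked logits to a large negative value before the softmax.
    if mask is not None:
        scaled_attention_logits += (1. - mask) * -1e9

    # Softmax over the key dimension, then weight the values.
    attention_weights = tf.nn.softmax(scaled_attention_logits, axis=-1)
    output = tf.matmul(attention_weights, v)
    return output, attention_weights
```

Adding a large negative number to the logits, rather than, say, multiplying the logits or the softmax output by the mask, drives the softmax weight of blocked positions to effectively zero while leaving the weights of valid positions properly normalized.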