Error: Getting an assertion error: “Wrong masked error”
Function: scaled_dot_product_attention(q, k, v, mask)
Can anybody please help?
Please post a screen capture showing all of the assert messages.
There is a problem with either:
Thank you for your response, Mr. TMosh.
Here is the code:
matmul_qk = tf.matmul(q, k, transpose_b=True)
dk = k.shape[0]
scaled_attention_logits = tf.math.divide(matmul_qk, np.sqrt(dk))
attention_weights = tf.keras.activations.softmax(scaled_attention_logits)
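Since the snippet never applies mask and takes dk from k.shape[0] rather than from the key depth (the last dimension), those are the most likely causes of a “masked” assertion failure. Below is a minimal sketch of the usual scaled dot-product attention pattern, assuming a mask convention of 1 = attend and 0 = mask out (the exact convention and return values depend on the assignment, so treat this as a sketch rather than the reference implementation):

import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask):
    # Raw attention scores: shape (..., seq_len_q, seq_len_k)
    matmul_qk = tf.matmul(q, k, transpose_b=True)

    # dk is the depth of the keys, i.e. the last dimension, not k.shape[0]
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_attention_logits = matmul_qk / tf.math.sqrt(dk)

    # Apply the mask before the softmax (assumed convention: 1 = attend, 0 = mask out)
    if mask is not None:
        scaled_attention_logits += (1.0 - mask) * -1e9

    # Softmax over the key axis (the last axis)
    attention_weights = tf.keras.activations.softmax(scaled_attention_logits, axis=-1)

    output = tf.matmul(attention_weights, v)
    return output, attention_weights

Adding a large negative number to the masked logits drives their softmax weights toward zero, which is typically what the “masked” assertion is checking.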