The exercise’s hint/comment is:

softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1.

However, I cannot pass this exercise, and I suspect that I have gotten softmax wrong.
Is there any hint or reference documentation on how I can select the last axis? I have used an index of 0 for scaled_attention_logits but cannot pass the test.


Hi @pongyuenlam,
Could you please send me your code via direct message? Then I can better assist you.
Best, Raphael


Rather than sharing the hint, share a screenshot of the test you are failing, so we can guide you in the right direction.

@Deepti_Prasad Yes, I have just sent screenshots to you via direct message.


Hi @pongyuenlam

@Raphael_Reinauer1 asked you to send him your code; did you not send him these details? Since he came forward first to assist you, let him guide you through it.

Let me know if you still need assistance.

The screenshot below is the error screenshot shared by the learner, @Raphael_Reinauer1.

@Deepti_Prasad Yes, I had sent him code details earlier.


Hi @pongyuenlam

OK, he is probably away, so I am responding.

  1. For the code line
    “Multiply q and k transposed”: you have recalled the transpose part incorrectly. You need to pass q and k, and then transpose_b=True.

  2. For the code line “Scale matmul_qk with the square root of dk”: you accessed the shape of k incorrectly. You used k.shape, but you are supposed to use tf.shape(k). Your code line is correct except for that part; use tf.shape(k)[-1].

  3. To add the mask to the scaled tensor, the instructions given are:

  • Multiply (1. - mask) by -1e9 before adding it to the scaled attention logits.
    But you have added just the mask, which is incorrect.

  4. “Softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1”: you do not need to index into scaled_attention_logits, so kindly remove that. You only need to apply tf.keras.activations.softmax to the scaled_attention_logits, without the [0].
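Putting the four steps above together, a minimal sketch of the function might look like the following (this is an illustrative reconstruction, not the official assignment solution; the function name and argument order are assumptions):

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    # 1. Multiply q and k transposed: pass q and k, then transpose_b=True.
    matmul_qk = tf.matmul(q, k, transpose_b=True)  # (..., seq_len_q, seq_len_k)

    # 2. Scale matmul_qk with the square root of dk, using tf.shape(k)[-1]
    #    (a dynamic tensor shape), not k.shape.
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_attention_logits = matmul_qk / tf.math.sqrt(dk)

    # 3. Add the mask: multiply (1. - mask) by -1e9 before adding it,
    #    so masked positions get a large negative logit.
    if mask is not None:
        scaled_attention_logits += (1. - mask) * -1e9

    # 4. Softmax is normalized on the last axis (seq_len_k) so that the
    #    scores add up to 1. Apply it directly, without any [0] indexing.
    attention_weights = tf.keras.activations.softmax(scaled_attention_logits)

    output = tf.matmul(attention_weights, v)  # (..., seq_len_q, depth_v)
    return output, attention_weights
```

Note that `tf.keras.activations.softmax` normalizes over the last axis by default, so each row of `attention_weights` sums to 1.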

Let me know if you have any new error.


@Deepti_Prasad
Dear Deepti_Prasad,
Thanks! I can pass the test now and can proceed to the next exercise.
For your information.


Thanks, @Deepti_Prasad, for jumping in. I only just looked at my inbox.