I’m stuck and can’t find the problem. I’m applying the mask scaled by -1e9 when computing scaled_attention_logits, but I don’t understand why it is += instead of multiplication. I keep getting the following error.

I’m using tf.nn.softmax to compute attention_weights from the scaled_attention_logits with axis=1.
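(For anyone else wondering about the += question: the mask term is added before the softmax, not multiplied, because softmax of a zeroed logit is exp(0) = 1, not 0, so multiplying would still give the masked position weight. Here is a NumPy sketch of the idea; the softmax helper and toy values are mine, and I'm assuming the convention 1 = keep, 0 = mask, which may differ from the assignment's.)

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy logits for one query over four keys; key 2 should be masked out.
logits = np.array([[1.0, 2.0, 3.0, 0.5]])
mask = np.array([[1.0, 1.0, 0.0, 1.0]])  # 1 = keep, 0 = mask

# Adding (1 - mask) * -1e9 drives masked logits toward -inf,
# so softmax gives them ~0 probability and renormalizes the rest.
masked_logits = logits + (1.0 - mask) * -1e9
weights = softmax(masked_logits, axis=-1)  # weights[0, 2] is ~0

# Multiplying the logits by the mask instead just sets them to 0,
# and exp(0) = 1, so the "masked" position still gets weight.
wrong = softmax(logits * mask, axis=-1)    # wrong[0, 2] is NOT 0
```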

AssertionError                            Traceback (most recent call last)
in
      1 # UNIT TEST
----> 2 scaled_dot_product_attention_test(scaled_dot_product_attention)

~/work/W4A1/public_tests.py in scaled_dot_product_attention_test(target)
     73     assert np.allclose(weights, [[0.30719590187072754, 0.5064803957939148, 0.0, 0.18632373213768005],
     74                                  [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862],
---> 75                                  [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862]]), "Wrong masked weights"
     76     assert np.allclose(attention, [[0.6928040981292725, 0.18632373213768005],
     77                                    [0.6163482666015625, 0.2326965481042862],

AssertionError: Wrong masked weights
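(A side note on the axis choice: attention weights should be normalized over the key dimension, i.e. the last axis. With 2-D logits axis=1 happens to coincide with axis=-1, but with batched inputs it does not, so axis=-1 is the safe choice. A small NumPy check, again my own sketch rather than the assignment's code:)

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Batched logits with shape (batch, seq_q, seq_k)
logits = np.random.default_rng(0).normal(size=(2, 3, 4))

w_last = softmax(logits, axis=-1)  # normalizes over keys: each row sums to 1
w_ax1 = softmax(logits, axis=1)    # normalizes over queries instead

print(np.allclose(w_last.sum(axis=-1), 1.0))  # True
print(np.allclose(w_ax1.sum(axis=-1), 1.0))   # False
```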