C5_W4_A1 scaled_dot_product_attention assistance requested

I am struggling to solve this exercise. I have implemented what I believe are the right steps, but I cannot get past a "Wrong Masked Weights" error. I would greatly appreciate guidance:

AssertionError                            Traceback (most recent call last)
<ipython-input-118-00665b20febb> in <module>
      1 # UNIT TEST
----> 2 scaled_dot_product_attention_test(scaled_dot_product_attention)

~/work/W4A1/public_tests.py in scaled_dot_product_attention_test(target)
     73     assert np.allclose(weights, [[0.30719590187072754, 0.5064803957939148, 0.0, 0.18632373213768005],
     74                                  [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862],
---> 75                                  [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862]]), "Wrong masked weights"
     76     assert np.allclose(attention, [[0.6928040981292725, 0.18632373213768005],
     77                                    [0.6163482666015625, 0.2326965481042862],

AssertionError: Wrong masked weights

You have posted in the General Discussions forum.

Please use the “pencil” tool in the thread title to move your post to the forum for the course you’re attending.

Thanks!

This suggests an error in how your code is using the “mask” variable in this part of the code:

[screenshot of the mask-application line from the notebook]
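
For context, here is a minimal sketch of what that function computes, assuming the TensorFlow setup this assignment uses and its convention that mask == 1 marks positions to keep (names like `scaled_attention_logits` follow the notebook; your exact code may differ):

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask):
    """Sketch of attention(Q, K, V) = softmax(QK^T / sqrt(d_k) + M) V,
    assuming mask == 1 keeps a position and mask == 0 blocks it."""
    # Raw similarity scores, shape (..., seq_len_q, seq_len_k)
    matmul_qk = tf.matmul(q, k, transpose_b=True)

    # Scale by sqrt(d_k) so the softmax stays in a well-behaved range
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_attention_logits = matmul_qk / tf.math.sqrt(dk)

    # Add a large negative number where mask == 0, BEFORE the softmax,
    # so blocked positions receive ~0 attention weight
    if mask is not None:
        scaled_attention_logits += (1. - mask) * -1e9

    attention_weights = tf.nn.softmax(scaled_attention_logits, axis=-1)
    output = tf.matmul(attention_weights, v)  # (..., seq_len_q, depth_v)
    return output, attention_weights
```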

Thanks. I messed up the order of operations, placing the -1e9 inside the parentheses instead of outside.
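For anyone who hits the same assertion later, the two versions differ only in where the parentheses close (a reconstruction of the bug, since the exact failing line isn't shown above):

```python
# Buggy: -1e9 multiplies the mask inside the parentheses, so this
# adds (1 + 1e9 * mask) -- blocked positions barely change (+1) and
# kept positions get a huge positive boost instead.
scaled_attention_logits += (1. - mask * -1e9)

# Fixed: multiply outside the parentheses, adding -1e9 exactly
# where mask == 0.
scaled_attention_logits += (1. - mask) * -1e9
```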