Week 4 UNQ_C3 Issue

I have been stuck on UNQ_C3 for a while. I'm quite certain my implementation is correct, yet one of the test cases keeps failing. Here is my code for the block:

Code Removed due to honor code

The failing test uses the following mask:
mask = np.array([[1, 1, 0, 1], [1, 1, 0, 1], [1, 1, 0, 1]])

The expected attention weights for this test case are:

[[0.30719590187072754, 0.5064803957939148, 0.0, 0.18632373213768005],
 [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862],
 [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862]]

But I am getting the following as a value for attention_weights:

[[0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]]

Also, here is the full error, in case it's useful:

AssertionError                            Traceback (most recent call last)
<ipython-input-97-89e3aab06b1a> in <module>
     30     print("\033[92mAll tests passed")
     31 
---> 32 scaled_dot_product_attention_test(scaled_dot_product_attention)

<ipython-input-97-89e3aab06b1a> in scaled_dot_product_attention_test(target)
     23     assert np.allclose(weights, [[0.30719590187072754, 0.5064803957939148, 0.0, 0.18632373213768005],
     24                                  [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862],
---> 25                                  [0.3836517333984375, 0.3836517333984375, 0.0, 0.2326965481042862]]), "Wrong masked weights"
     26     assert np.allclose(attention, [[0.6928040981292725, 0.18632373213768005],
     27                                    [0.6163482666015625, 0.2326965481042862],

AssertionError: Wrong masked weights

Any help would be greatly appreciated! I’ll make sure to update my post to remove the code once the issue is resolved.


I finally found the issue. The instructions never say to multiply "mask" itself by -1e9. Instead, they state:
"Multiply (1. - mask) by -1e9 before applying the softmax."
Multiplying mask directly suppresses the positions you want to keep and leaves the masked column untouched, which is exactly why all of the attention weight collapsed onto that column. The masking step should look like this:

scaled_attention_logits += (1 - mask) * -1e9
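
For anyone hitting the same symptom, here is a minimal NumPy sketch contrasting the two versions of the masking step. The logit values below are made up purely for illustration; this is not the assignment's code.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

# Toy scaled attention logits (hypothetical values)
scaled_attention_logits = np.array([[0.5, 1.0, 0.2, 0.0],
                                    [0.3, 0.3, 0.9, -0.2],
                                    [0.3, 0.3, 0.9, -0.2]])

# mask == 1 marks positions to keep, mask == 0 marks positions to hide
mask = np.array([[1, 1, 0, 1],
                 [1, 1, 0, 1],
                 [1, 1, 0, 1]])

# Correct: (1 - mask) is 1 only at the hidden positions, so those logits
# are pushed to -1e9 and softmax drives their weights to ~0
correct = softmax(scaled_attention_logits + (1.0 - mask) * -1e9)

# Buggy: mask * -1e9 suppresses the kept positions instead, so softmax
# puts all the weight on the one masked column (the [0, 0, 1, 0] symptom)
buggy = softmax(scaled_attention_logits + mask * -1e9)

print(correct)  # column 2 is ~0, sensible weights elsewhere
print(buggy)    # all weight on column 2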