W4, scaled_dot_product_attention, Output must be a tensor

Hello, I think my output value is the same as the one asserted in the test, but the following message appears:

my output: 
[[0.74105227 0.15705977]
 [0.7227253  0.16817567]
 [0.6637989  0.2039163 ]]
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-146-00665b20febb> in <module>
      1 # UNIT TEST
----> 2 scaled_dot_product_attention_test(scaled_dot_product_attention)

~/work/W4A1/public_tests.py in scaled_dot_product_attention_test(target)
     62                                    [0.33620113, 0.33620113, 0.12368149, 0.2039163 ]]), "Wrong unmasked weights"
     63 
---> 64     assert tf.is_tensor(attention), "Output must be a tensor"
     65     assert tuple(tf.shape(attention).numpy()) == (q.shape[0], v.shape[1]), f"Wrong shape. We expected ({q.shape[0]}, {v.shape[1]})"
     66     assert np.allclose(attention, [[0.74105227, 0.15705977],

AssertionError: Output must be a tensor

I have looked at many posts, and those with the same error were defining dk wrong, so I defined it as dk = np.shape(k)[0]. If someone wants to take a look, my lab ID is: mnlvckyqqczx

Thank you!

Hi! Have you tried the tf.is_tensor() function? If your output is not a tensor, you'll need to modify it so that it is a tf tensor. Please note that a NumPy array is not a tensor.
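For reference, a minimal sketch of that check (variable names here are just illustrative, assuming TF 2.x): tf.is_tensor returns False for plain NumPy arrays and True for TensorFlow tensors, which is exactly what the unit test asserts.

import numpy as np
import tensorflow as tf

x_np = np.ones((3, 2))      # plain NumPy array
x_tf = tf.ones((3, 2))      # TensorFlow tensor

print(tf.is_tensor(x_np))   # False -- this is what trips the assert
print(tf.is_tensor(x_tf))   # True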

Thank you, I had np.matmul instead of tf.matmul, so the last step was returning a NumPy array instead of a tensor.
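For anyone hitting the same error, a small sketch of the difference (the values below are made up, not the assignment's test data): np.matmul converts its inputs via NumPy and returns an ndarray, while tf.matmul keeps the result as a tf.Tensor.

import numpy as np
import tensorflow as tf

attention_weights = tf.constant([[0.7, 0.3]])         # tf.Tensor
v = tf.constant([[1.0, 0.0], [0.0, 1.0]])             # tf.Tensor

out_np = np.matmul(attention_weights, v)   # NumPy takes over: returns an ndarray
out_tf = tf.matmul(attention_weights, v)   # stays a tf.Tensor

print(tf.is_tensor(out_np))   # False -- fails the "Output must be a tensor" assert
print(tf.is_tensor(out_tf))   # True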
