C5 W4 Ex 7 - Output must be a tensor

Hi, I'm stuck here. The test says the output must be a tensor, but I couldn't find any place where I used np instead of tf (except where np.newaxis / np.arange were explicitly asked for).
Can you please help?
Thanks.

AssertionError Traceback (most recent call last)
in
      1 # UNIT TEST
----> 2 Decoder_test(Decoder, create_look_ahead_mask, create_padding_mask)

~/work/W4A1/public_tests.py in Decoder_test(target, create_look_ahead_mask, create_padding_mask)
    226     assert type(att_weights) == dict, "Wrong type for att_weights[0]. Output must be a tensor"
    227     assert len(keys) == 2 * num_layers, f"Wrong length for attention weights. It must be 2 x num_layers = {2*num_layers}"
--> 228     assert tf.is_tensor(att_weights[keys[0]]), f"Wrong type for att_weights[{keys[0]}]. Output must be a tensor"
    229     shape1 = (x.shape[0], num_heads, x.shape[1], x.shape[1])
    230     assert tuple(tf.shape(att_weights[keys[1]]).numpy()) == shape1, f"Wrong shape. We expected {shape1}"

AssertionError: Wrong type for att_weights[decoder_layer1_block1_self_att]. Output must be a tensor


I have not gotten to this assignment yet, but the error tells you exactly which value has the wrong type. What type is att_weights['decoder_layer1_block1_self_att']? Maybe it isn't a numpy array at all; maybe it's a Function object or a Layer object.

There’s one way to find out:

print(type(att_weights['decoder_layer1_block1_self_att']))
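For reference, the failing assert just calls tf.is_tensor on each value in the attention-weights dictionary, so you can reproduce the check in a scratch cell on its own (a minimal sketch using only standard TensorFlow, nothing from the assignment):

import tensorflow as tf

# tf.is_tensor is what the unit test applies to every stored attention-weights value.
print(tf.is_tensor(None))               # False -> a value left as None triggers the AssertionError
print(tf.is_tensor(tf.zeros((2, 3))))   # True  -> an actual tensor is what the test expects

So if any entry of the dictionary is still None (or a plain Python object), you will see exactly this error.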


Thanks, so this is what I get.
How do I proceed?

---> 73         print(type(att_weights['decoder_layer1_block1_self_att']))
     74
     75         # x.shape == (batch_size, target_seq_len, embedding_dim)

NameError: name 'att_weights' is not defined

Here is the whole traceback:

in
      1 # UNIT TEST
----> 2 Decoder_test(Decoder, create_look_ahead_mask, create_padding_mask)

~/work/W4A1/public_tests.py in Decoder_test(target, create_look_ahead_mask, create_padding_mask)
    218                                      target_vocab_size,
    219                                      maximum_position_encoding)
--> 220     outd, att_weights = decoderk(x, encoderq_output, False, look_ahead_mask, None)
    221     assert tf.is_tensor(outd), "Wrong type for outd. It must be a dict"
    222     assert np.allclose(tf.shape(outd), tf.shape(encoderq_output)), f"Wrong shape. We expected { tf.shape(encoderq_output)}"

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
   1010         with autocast_variable.enable_auto_cast_variables(
   1011             self._compute_dtype_object):
--> 1012           outputs = call_fn(inputs, *args, **kwargs)
   1013
   1014         if self._activity_regularizer:

in call(self, x, enc_output, training, look_ahead_mask, padding_mask)
     71         attention_weights['decoder_layer{}_block2_decenc_att'.format(i+1)] = None
     72         # END CODE HERE
---> 73         print(type(att_weights['decoder_layer1_block1_self_att']))
     74
     75         # x.shape == (batch_size, target_seq_len, embedding_dim)

NameError: name 'att_weights' is not defined


You have to replace all the None placeholders with the correct values. I still see None in your code: in the traceback above, attention_weights['decoder_layer{}_block2_decenc_att'.format(i+1)] is still assigned None.
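To make the expected structure concrete, here is a runnable toy version of the pattern the test is looking for: a dictionary whose values are the attention-weight tensors themselves. This is only a sketch; fake_decoder_layer is a made-up stand-in for the real decoder layer, and only the key names are taken from the error messages in this thread.

import tensorflow as tf

num_layers = 2
attention_weights = {}

# Hypothetical stand-in for a decoder layer that returns its two attention-weight tensors.
def fake_decoder_layer():
    return tf.zeros((1, 8, 3, 3)), tf.zeros((1, 8, 3, 3))

for i in range(num_layers):
    block1, block2 = fake_decoder_layer()
    # Store the tensors themselves, not None, under the expected keys.
    attention_weights['decoder_layer{}_block1_self_att'.format(i + 1)] = block1
    attention_weights['decoder_layer{}_block2_decenc_att'.format(i + 1)] = block2

# Every value is now a tensor, which is what Decoder_test asserts with tf.is_tensor.
assert all(tf.is_tensor(v) for v in attention_weights.values())
assert len(attention_weights) == 2 * num_layers

In the assignment the two tensors come back from the decoder layer call inside that loop; the point is simply that the dictionary must end up holding tensors, not None.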

Regarding att_weights: it is the same thing as attention_weights, but the unit test uses the former name while the notebook uses the latter. @Mubsi! Don't you think the unit test should use the same names as the notebook? If so, I will post a Git issue.

Right. I forgot to update the attention weights! Thanks! :blush: