I get the following error when I call `x = self.enc_layers(x)`:
57 # Pass the output through the stack of encoding layers
58 for i in range(self.num_layers):
---> 59 x = self.enc_layers(x)
60 # END CODE HERE
61
If I remove the x argument and call `self.enc_layers()` instead, I get the following:
~/work/W4A1/public_tests.py in Encoder_test(target)
116 encoderq_output = encoderq(x, True, None)
117
--> 118 assert tf.is_tensor(encoderq_output), "Wrong type. Output must be a tensor"
119 assert tuple(tf.shape(encoderq_output).numpy()) == (x.shape[0], x.shape[1], embedding_dim), f"Wrong shape. We expected ({x.shape[0]}, {x.shape[1]}, {embedding_dim})"
120 assert np.allclose(encoderq_output.numpy(),
AssertionError: Wrong type. Output must be a tensor

TypeError: 'ListWrapper' object is not callable
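The "'ListWrapper' object is not callable" message suggests that `self.enc_layers` is a list of layers (TensorFlow wraps list attributes of a model in a `ListWrapper` for variable tracking), so the list itself cannot be called like a layer. Inside the loop, the list must be indexed first and then the selected layer called. Here is a minimal sketch of that pattern using a toy layer class (the names `ToyLayer` and `enc_layers` are illustrative, not from the assignment):

```python
class ToyLayer:
    """Stand-in for one encoder layer: a callable object."""
    def __call__(self, x):
        return x + 1

# A plain list of layers, analogous to TensorFlow's ListWrapper
enc_layers = [ToyLayer() for _ in range(3)]

x = 0
for i in range(len(enc_layers)):
    # Index the list first, THEN call the selected layer on x
    x = enc_layers[i](x)

# Calling the list itself reproduces the reported failure:
# enc_layers(x)  ->  TypeError: 'list' object is not callable
```

The same idea applies in the assignment's loop: each iteration should invoke one layer from the stack (with whatever training/mask arguments the layer signature expects), not the container holding them.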