I have managed to do most of the assignment. I have successfully implemented both the encoder and decoder; however, in the final Transformer block, I am getting "Wrong values in translation" as my output. I have looked through the Transformer block and can't find anything wrong with my implementation. Could anyone assist, please?
Please share your full error.
AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>
      1 # UNIT TEST
----> 2 Transformer_test(Transformer, create_look_ahead_mask, create_padding_mask)

~/work/W4A1/public_tests.py in Transformer_test(target, create_look_ahead_mask, create_padding_mask)
    286     assert np.allclose(translation[0, 0, 0:8],
    287                        [0.017416516, 0.030932948, 0.024302809, 0.01997807,
--> 288                         0.014861834, 0.034384135, 0.054789476, 0.032087505]), "Wrong values in translation"
    289
    290     keys = list(weights.keys())

AssertionError: Wrong values in translation
Please send me your Transformer code in a private message. Click my name and send a message.
Thank you for sharing your code.
Check your self.decoder call.
Hint: the Encoder takes the input sentence. What does the Decoder take?
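To make that hint concrete, here is a toy data-flow sketch. The encoder and decoder below are dummy NumPy stand-ins (not the assignment's actual layers); the point is only the wiring: the encoder consumes the input sentence, while the decoder must consume the target (output) sentence together with the encoder's output. Feeding the input sentence to the decoder instead is a common way to get "Wrong values in translation".

```python
import numpy as np

def encoder(input_sentence):
    # dummy encoder: stands in for the real Encoder, which returns
    # a contextual representation of the INPUT sentence
    return input_sentence * 2.0

def decoder(target_sentence, enc_output):
    # dummy decoder: the real Decoder takes BOTH the TARGET sentence
    # and the encoder's output, not the input sentence again
    return target_sentence + enc_output

inp = np.array([1.0, 2.0])   # source-language tokens (illustrative)
tar = np.array([0.5, 0.5])   # target-language tokens (illustrative)

enc_output = encoder(inp)          # encoder sees the input sentence
dec_output = decoder(tar, enc_output)  # decoder sees target + encoder output
```

The bug this sketch guards against is calling decoder(inp, enc_output): the shapes may still line up, so nothing crashes, but the translation values come out wrong.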
I initially had the same issue as the OP. I have found my mistake now (different from the OP's, I think) and passed the tests. Just for reference, I was slightly thrown off by this code comment:
# pass decoder output through a linear layer and softmax (~2 lines)
What confused me was the "~2 lines" part.
I got the same error and went through all the similar posts, but I am still having issues with my code. I checked Transformer, Decoder, and Encoder but still can't fix the issue. Can I direct-message someone who can take a look at my code?
Hi, Daniel.
Yes, please check your DMs for a message from me about how to share code.