Stuck on the final full Transformer step. I’ve passed the tests for all the previous layers (87/100 on the grader).
From the forums it seems quite a few people get tripped up here by not passing the correct first argument to the decoder, but I believe what I have is correct. Any guidance on what I’m doing wrong would be very gratefully received.
Transformer code
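The relevant lines, paraphrased rather than pasted from the graded cell (so `__init__` and the graded details are omitted):

```python
# Inside Transformer.call() -- paraphrased sketch, not my exact notebook code.
# enc_output comes from the encoder; the decoder's FIRST argument is the
# target sequence (output_sentence), with enc_output passed second.
enc_output = self.encoder(input_sentence, training, enc_padding_mask)
dec_output, attention_weights = self.decoder(
    output_sentence, enc_output, training, look_ahead_mask, dec_padding_mask)
final_output = self.final_layer(dec_output)  # (batch, tar_seq_len, target_vocab_size)
```

The unit test then fails with: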
```
AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      1 # UNIT TEST
----> 2 Transformer_test(Transformer, create_look_ahead_mask, create_padding_mask)
```
I’ve since been able to work out where I was going wrong. It turns out my Encoder and Decoder implementations both had the same error in how I was scaling the embeddings. In both cases the error passed the in-notebook tests and the autograder for UNQ_C5 and UNQ_C7, but it caused the final Transformer UNQ_C8 test to fail. Once I scaled the embeddings correctly, the failure went away.
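For anyone landing here later: the step in question is the one the notebook’s comments describe, where the embedding output is multiplied by sqrt(d_model) before the positional encoding is added. A minimal sketch, assuming a standard Keras Embedding layer (the helper name here is illustrative, not the notebook’s exact code):

```python
import tensorflow as tf

def embed_and_scale(x, embedding_layer, d_model):
    # Illustrative helper, not the assignment's exact code: embed the token
    # ids, then scale by sqrt(d_model) BEFORE adding positional encodings.
    x = embedding_layer(x)                            # (batch, seq_len, d_model)
    x *= tf.math.sqrt(tf.cast(d_model, tf.float32))   # the step that's easy to get wrong
    return x
```

The same scaling step appears in both Encoder.call() and Decoder.call(), which is presumably why a mistake made consistently in both still slipped past the UNQ_C5 and UNQ_C7 tests.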
I am getting the exact same error, with all other tests passing. However, I’ve gone through my implementation of the scaling and don’t see any discrepancy. So either I’m missing something in that documentation, or there are multiple possible sources of this error that I don’t know how to track down.
```
--> 288     ..., 0.014861834, 0.034384135, 0.054789476, 0.032087505]), "Wrong values in outd"
```