C5_W4_A1_Transformer_Subclass_v1 : UNQ_C8

Stuck on the final full Transformer step. I've passed the tests for all the previous layers (87/100 on the grader).
It seems from the forums etc. that quite a few people are tripped up here by not passing the correct first argument to the decoder, but I believe what I have here is correct. Any guidance on what I'm doing wrong would be very gratefully received.

Transformer code

AssertionError Traceback (most recent call last)
in
1 # UNIT TEST
----> 2 Transformer_test(Transformer, create_look_ahead_mask, create_padding_mask)

~/work/W4A1/public_tests.py in Transformer_test(target, create_look_ahead_mask, create_padding_mask)
286 assert np.allclose(translation[0, 0, 0:8],
287 [0.017416516, 0.030932948, 0.024302809, 0.01997807,
--> 288 0.014861834, 0.034384135, 0.054789476, 0.032087505]), "Wrong values in outd"
289
290 keys = list(weights.keys())

AssertionError: Wrong values in outd

I've since been able to work out where I was going wrong. It turns out my Encoder and Decoder implementations both had the same error in how I was scaling the embeddings. In both cases the error passed the in-notebook tests and also passed the autograder for UNQ_C5 and UNQ_C7, but it was causing failures in the final Transformer UNQ_C8 tests.

When I correctly scaled the embeddings, things improved.
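For anyone hitting the same wall: the scaling in question is multiplying the embedding output by sqrt(d_model) before adding the positional encoding, as in the original paper and the TF tutorial. A minimal NumPy sketch of that step (the variable names, shapes, and random values here are purely illustrative, not the notebook's actual code):

```python
import numpy as np

# Illustrative sizes only (the assignment uses its own values).
d_model = 4
vocab_size = 10
max_len = 50

rng = np.random.default_rng(0)
embedding = rng.normal(size=(vocab_size, d_model))       # stand-in embedding table
pos_encoding = rng.normal(size=(1, max_len, d_model))    # stand-in positional encodings

def embed(ids):
    """Embedding lookup -> scale by sqrt(d_model) -> add positional encoding."""
    x = embedding[ids]                      # (batch, seq_len, d_model)
    x *= np.sqrt(d_model)                   # the easy-to-miss scaling step
    x += pos_encoding[:, :ids.shape[1], :]  # add positions AFTER scaling
    return x

ids = np.array([[1, 2, 3]])
out = embed(ids)
print(out.shape)  # (1, 3, 4)
```

The key point is the order: scale first, then add the positional encoding. Doing it the other way round (or skipping the scaling) still produces tensors of the right shape, which is why the per-layer tests can pass while the full-model values come out wrong.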

I used the implementation details in Transformer model for language understanding  |  Text  |  TensorFlow to debug my problem. I doubt I'd have been able to find the error without this additional information.

Thanks for your report.
There are some updates planned for this assignment to address the issues.

I am getting the exact same error, with all the other tests passing. However, I went through my implementation of the scaling and I don't see any discrepancies. So either I'm missing something in that documentation, or there are multiple possible sources of this error that I don't know how to track down.

(--> 288 0.014861834, 0.034384135, 0.054789476, 0.032087505]), "Wrong values in outd"

edit: to bump and @TMosh

Which “exact same error” are you referring to? Please be specific and post your own error messages or assert stack.

As in, character-for-character the same error:


AssertionError Traceback (most recent call last)
in
1 # UNIT TEST
----> 2 Transformer_test(Transformer, create_look_ahead_mask, create_padding_mask)

~/work/W4A1/public_tests.py in Transformer_test(target, create_look_ahead_mask, create_padding_mask)
286 assert np.allclose(translation[0, 0, 0:8],
287 [0.017416516, 0.030932948, 0.024302809, 0.01997807,
--> 288 0.014861834, 0.034384135, 0.054789476, 0.032087505]), "Wrong values in outd"
289
290 keys = list(weights.keys())

AssertionError: Wrong values in outd

Please send me your code for the Transformer() module via a private message.

Or you can search the Discourse forum for the phrase "wrong values in outd"; I think there are several threads about this error.

You are correct that there are many, but only this topic discusses the failure on line 288 of the test.

Hi,

I was getting exactly the same issue as you: everything passed except UNQ_C8. I fixed my scaled_dot_product_attention function and got 100%.
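Since this was the other place the bug hid for me, here is a minimal NumPy sketch of scaled dot-product attention as described in the paper, so you can compare against your own version. This is an illustrative stand-in, not the notebook's solution; the function name and signature just mirror the common convention.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """softmax(q @ k^T / sqrt(depth)) @ v, with an optional additive mask."""
    matmul_qk = q @ k.swapaxes(-1, -2)            # (..., seq_q, seq_k)
    dk = k.shape[-1]
    scaled = matmul_qk / np.sqrt(dk)              # scale by sqrt of key depth
    if mask is not None:
        scaled += mask * -1e9                     # push masked positions to ~0 after softmax
    # numerically stable softmax over the last (key) axis
    weights = np.exp(scaled - scaled.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Tiny smoke test with random inputs (shapes are illustrative).
rng = np.random.default_rng(0)
q = rng.normal(size=(1, 3, 4))
k = rng.normal(size=(1, 3, 4))
v = rng.normal(size=(1, 3, 4))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape, w.shape)  # (1, 3, 4) (1, 3, 3)
```

Common slips to check against: dividing by dk instead of sqrt(dk), applying the mask after the softmax instead of before, or taking the softmax over the wrong axis. Each of these can still pass shape-based checks while producing wrong values downstream.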

I was very lucky to find your comment. Thank you very much for pointing out the documentation.

Mentors, please fix this notebook and make it easier to understand! Thank you.