NLP C4 week 2 Transformer Test case error

In the Week 2 programming assignment of Course 4, Exercise 4 (Transformer), a test case is failing. Here's the output of the test case:

Failed test case: Wrong values in summary.
Expected: [0.04855702, 0.03407773, 0.01294427, 0.05483282, 0.03182802, 0.01409046, 0.02963346, 0.04003222]
Got: [0.04887775 0.03356204 0.01315448 0.05506424 0.0318282 0.01480375
0.02955461 0.03947698]

Failed test case: Wrong values in weights[decoder_layer1_block1_self_att].
Expected: [0.481374, 0.51862603, 0.0, 0.0, 0.0]
Got: [0.4926643 0.5073357 0. 0. 0. ]

Failed test case: Wrong values in summary.
Expected: [0.05015587, 0.02734077, 0.01308834, 0.04876801, 0.03092919, 0.02046618, 0.02923589, 0.03272967]
Got: [0.050137 0.02713667 0.01314374 0.04854858 0.03089247 0.02079512
0.02920983 0.03248784]

The expected values and the values produced by my code differ only slightly, but I can't figure out what's causing my code not to produce the exact output.

The 'test your function' block for the same code works fine; its output matches the 'expected output'.

Any help would be appreciated!


Hi @Abhay_Pancholi

One place to check would be whether you have a typo in one of your layernorms (e.g., layernorm2 where layernorm3 should be), since the values do not differ by much.
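To illustrate why such a typo is easy to miss: the code still runs and produces values close to the expected ones. This is a minimal pure-Python sketch with hypothetical names (a toy layer norm with per-layer gains, not the assignment's actual layers):

```python
def layer_norm(x, gain):
    # toy layer norm: center, scale to unit variance, then apply a learned gain
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [gain * (v - mean) / (var + 1e-6) ** 0.5 for v in x]

# three distinct norm layers whose learned gains differ only slightly
gains = {"layernorm1": 1.0, "layernorm2": 1.01, "layernorm3": 0.99}

x = [0.2, 0.5, 0.9]
correct = layer_norm(x, gains["layernorm3"])  # the norm that belongs after this sublayer
typo = layer_norm(x, gains["layernorm2"])     # typo: reuses layernorm2 instead

# the two outputs are close but not identical, much like the failing test case
```

Because the layers are nearly interchangeable numerically, the only symptom is a small drift in the final values, exactly the pattern in the test output above.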


Hello @arvyzukai sir,
Thank you for the reply.
I checked for a possible typo in the code, but found none.

No problem @Abhay_Pancholi, I’m happy to help. Then maybe the culprit is the dropout layer?

Okay sir, but I don't quite follow: the dropout layer is applied after the fully connected layer, and the dropout_rate parameter is set to 0.1 by default.

If anyone encounters the same values in the future, chances are you're feeding the decoder the input sentence instead of the output sentence, as you should.

Even though the values differ only slightly in this test case, the principle is very different. In other words, the first argument to the decoder here should be the output_sentence (the summary "so far").
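To make the argument order concrete, here is a minimal pure-Python sketch of the data flow (all names here are hypothetical stand-ins, not the course's actual API):

```python
def encoder(input_sentence):
    # stand-in for the real encoder: produces one context vector per input token
    return [tok.upper() for tok in input_sentence]

def decoder(output_sentence, enc_output):
    # stand-in for the real decoder: consumes the summary generated so far
    # while attending to the encoder output
    return [f"{tok}|{ctx}" for tok, ctx in zip(output_sentence, enc_output)]

def transformer(input_sentence, output_sentence):
    enc_output = encoder(input_sentence)
    # Correct: the decoder's first argument is the output sentence
    # (the summary "so far"), not the input sentence.
    dec_output = decoder(output_sentence, enc_output)
    # Wrong (the bug discussed above, which yields slightly-off values):
    # dec_output = decoder(input_sentence, enc_output)
    return dec_output
```

Passing the input sentence to both the encoder and the decoder still produces plausible-looking numbers, which is why the test values end up close but not equal.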

Cheers
