Course 5, Week 4: Transformer Class

I am a little confused by the last part of the Transformer-architecture Task.
I got the following error:

    AssertionError                            Traceback (most recent call last)
    ---> 65 Transformer_test(Transformer)

    in Transformer_test(target)
         47     assert np.allclose(translation[0, 0, 0:8],
         48                        [[0.02664799, 0.02222014, 0.01641812, 0.02407483,
    ---> 49                          0.04251551, 0.02240461, 0.01556584, 0.03741234]]), "Wrong values in outd"
         51     keys = list(weights.keys())

    AssertionError: Wrong values in outd

I simply called the three layers in the Transformer class:

    enc_output = self.encoder(inp, training, enc_padding_mask)
    dec_output, attention_weights = self.decoder(inp, enc_output, training, look_ahead_mask, dec_padding_mask)
    final_output = self.final_layer(dec_output)

Can someone explain why my output doesn't match?

Take a look at the slide in the Week 4 lecture: the encoder input is the source-language sentence, while the decoder input is the target-language sentence. You are passing `inp` to the decoder; it should be `tar`.
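The data flow described above can be sketched with toy NumPy stand-ins (the class and argument names mirror the assignment's, but these are not the real Keras layers, and the shapes here are made up for illustration):

```python
import numpy as np

D_MODEL = 4  # toy model dimension, just for illustration

class Encoder:
    """Stand-in: consumes the SOURCE sentence."""
    def __call__(self, inp, training, padding_mask):
        # returns (batch, inp_seq_len, d_model)
        return np.ones(inp.shape + (D_MODEL,))

class Decoder:
    """Stand-in: consumes the TARGET sentence and cross-attends to enc_output."""
    def __call__(self, tar, enc_output, training, look_ahead_mask, padding_mask):
        # returns (batch, tar_seq_len, d_model) plus attention weights
        return np.ones(tar.shape + (D_MODEL,)), {"decoder_layer1_block2": None}

class FinalLayer:
    """Stand-in for the final Dense projection to the target vocabulary."""
    def __call__(self, dec_output):
        return dec_output

def transformer_call(inp, tar, training,
                     enc_padding_mask, look_ahead_mask, dec_padding_mask):
    encoder, decoder, final_layer = Encoder(), Decoder(), FinalLayer()
    enc_output = encoder(inp, training, enc_padding_mask)   # source goes in here
    dec_output, attention_weights = decoder(
        tar,                                                # TARGET goes in here, not inp
        enc_output, training, look_ahead_mask, dec_padding_mask)
    return final_layer(dec_output), attention_weights
```

Note that the output sequence length follows the target (`tar`), not the source. If you feed `inp` to the decoder, you get tensors shaped by the source sentence and the test values no longer match.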


Having the same issue, despite passing all the previous tests:

    enc_output = self.encoder(inp, training, enc_padding_mask)
    dec_output, attention_weights = self.decoder(tar, enc_output, training, look_ahead_mask, dec_padding_mask)
    final_output = self.final_layer(dec_output)

I keep getting "Wrong values in outd".

It passes the final grader with 100/100, but I am unable to pass the notebook test for that part of the code. Not sure which one is right.

It looks like the new revision (updated yesterday) forgot to update Transformer_test. The grader is correct. If you passed the grader, it means you're awesome!
BTW, Transformer_test should be:


    assert np.allclose(translation[0, 0, 0:8],
                       [[0.02616475, 0.02074359, 0.01675757, 0.025527,
                         0.04473696, 0.02171909, 0.01542725, 0.03658631]]), "Wrong values in outd"

    assert np.allclose(weights[keys[0]][0, 0, 1], [0.4992985, 0.5007015, 0., 0., 0.]), f"Wrong values in weights[{keys[0]}]"


Thanks, I was passing the wrong input to the decoder.
This network architecture is hard to follow.