C4W2 Decoder passes all ex3 tests but throws an error in Transformer ex4 (positional encoding)

As in the subject line, my implementation of the decoder passed all tests for exercise 3, but the line with the positional encoding causes an error in the Transformer of exercise 4.

Any help would be very much appreciated. I have no clue what might have gone wrong.


How are you finding seq_len?


seq_len = tf.shape(x)[1]

This is a line from the provided code (before the START CODE HERE annotation); I didn't change anything.
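For context on where that `seq_len` ends up: in the usual TensorFlow Transformer layout, the decoder adds a slice of a precomputed positional-encoding table to the embedded input, `x += pos_encoding[:, :seq_len, :]`. A common cause of the ex4-only failure is a shape mismatch there, e.g. the table was built with too few positions or with a `d_model` that doesn't match the embedding dimension. Here is a minimal numpy sketch of that pattern (the function name, shapes, and `max_positions` value are illustrative assumptions, not the assignment's exact code):

```python
import numpy as np

def positional_encoding(max_positions, d_model):
    # Sine/cosine positional encoding from "Attention Is All You Need":
    # even dimensions get sin, odd dimensions get cos.
    positions = np.arange(max_positions)[:, np.newaxis]        # (max_positions, 1)
    dims = np.arange(d_model)[np.newaxis, :]                   # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / np.float64(d_model))
    angles = positions * angle_rates                           # (max_positions, d_model)
    angles[:, 0::2] = np.sin(angles[:, 0::2])
    angles[:, 1::2] = np.cos(angles[:, 1::2])
    return angles[np.newaxis, ...]                             # (1, max_positions, d_model)

# In the decoder, seq_len comes from the current batch (tf.shape(x)[1])
# and is used to slice the precomputed table:
batch, seq_len, d_model = 2, 7, 16
x = np.zeros((batch, seq_len, d_model))
pos_encoding = positional_encoding(max_positions=50, d_model=d_model)
x = x + pos_encoding[:, :seq_len, :]   # breaks if seq_len > max_positions
print(x.shape)                         # (2, 7, 16)
```

If the Transformer in ex4 calls the decoder with a longer sequence (or a different `d_model`) than the table was built for, the slice or the addition raises exactly this kind of error even though the decoder's own tests pass, so it's worth checking what sizes you pass when constructing the positional encoding.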


Could someone please try to help? It's really frustrating: I've been stuck on this issue and have no clue why the decoder passes its own tests but fails in the other exercise. I'd be really grateful for any hint.