C5_W4_A1_Transformer_Subclass_v1 UNQ_C6 Wrong values

Hi all,

I’ve already looked on this forum for an answer. But I’m stuck at

UNQ_C6 (UNIQUE CELL IDENTIFIER, DO NOT EDIT)

GRADED FUNCTION DecoderLayer

class DecoderLayer(tf.keras.layers.Layer):

attn_w_b1 and attn_w_b2 are both correct (no assertion error), but it ends with:
→ 182 assert np.allclose(out[0, 0], [-0.22109576, -1.5455486, 0.852692, 0.9139523]), “Wrong values in out”

AssertionError: Wrong values in out

My values of out[0,0]
print(out3[0,0])
tf.Tensor([-1.7139359 0.7697465 0.36217463 0.58201486], shape=(4,), dtype=float32)

I’ve checked the described steps of block 3 and don’t see any error in my code, but there must be one. Can I send my code (private message, of course) to someone, please?

Hey @Erwin_van_Dijk,
Welcome back to the community. Please DM me your code for this function, so that we can figure out where exactly the error resides in your code.

Cheers,
Elemento

Sometimes it is good to start from scratch again and compare with the previous attempt.
I should have used Q1 for the sum, but I had used mult_attn_out1. I didn’t see it until I started with a fresh and empty assignment.
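For anyone hitting the same assertion: the residual connection in the decoder’s second block should add Q1 (the layer-normalized output of block 1), not the raw mult_attn_out1. Below is a minimal NumPy sketch of that skip-connection pattern, not the assignment’s actual code: the values are toy numbers, the layernorm is simplified (no learned scale/bias), and the variable names just mirror the notebook’s.

```python
import numpy as np

def layernorm(x, eps=1e-6):
    # Simplified layer normalization over the last axis (no learned gamma/beta).
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True)
    return (x - mean) / (std + eps)

# Toy stand-ins for the tensors in the decoder layer (hypothetical values).
x = np.array([[1.0, 2.0, 3.0, 4.0]])                # decoder input
mult_attn_out1 = np.array([[0.4, 0.1, 0.3, 0.2]])   # block 1 self-attention output
mult_attn_out2 = np.array([[0.5, 0.5, 0.5, 0.5]])   # block 2 cross-attention output

# Block 1: skip connection with the input, then layernorm -> Q1.
Q1 = layernorm(mult_attn_out1 + x)

# Block 2: the skip connection must add Q1, the normalized output of
# block 1 -- not mult_attn_out1, which was the bug in this thread.
correct = layernorm(mult_attn_out2 + Q1)
wrong = layernorm(mult_attn_out2 + mult_attn_out1)

# The two choices give different values, which is why the "Wrong values
# in out" assertion fires even though the attention weights pass.
print(np.allclose(correct, wrong))
```

Because the test cell only checks the attention weights and the final output, a mistake in the skip connections shows up exactly like this: attn_w_b1 and attn_w_b2 pass, but `out` has wrong values.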

1 Like