[Week4] Transformer Network Decoder_test/Transformer_test

How to solve this?

Same question here. I was following the TF Transformer tutorial.

{mentor edit: hacks removed - no longer necessary}

{mentor edit: hacks removed - no longer necessary}

I have the same issue, but I think the problem lies elsewhere; I got the same output as you.

My DecoderLayer works fine, but the Decoder fails. Here is my Decoder's call function:

{mentor edit: code removed}

Mine is the same as yours, and the Decoder call still fails. Here is my DecoderLayer; I commented out two assertions and it passed the tests:

{mentor edit: code removed}

{mentor edit: hacks removed - no longer necessary}

assert np.allclose(outd[1, 1], [-0.34560338, -0.8762897, -0.4767484, 1.6986415]), "Wrong values in outd"

Still the same error.

{mentor edit: hacks removed - no longer necessary}

When I do that, Decoder_test still fails as follows:

assert np.allclose(outd[1, 1], [-0.34560338, -0.8762897, -0.4767484, 1.6986415]), "Wrong values in outd"

See my previously shared Decoder code.
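As an aside, the failing test compares the decoder output against reference values with `np.allclose`, which tolerates tiny floating-point differences but fails on genuinely wrong values. A small illustrative sketch (the array values are the ones from the test above):

```python
import numpy as np

# The test's expected values for outd[1, 1]
expected = np.array([-0.34560338, -0.8762897, -0.4767484, 1.6986415])

# A tiny floating-point deviation still passes np.allclose...
close = np.allclose(expected + 1e-9, expected)

# ...but a real numerical error (e.g. a missing scaling step) does not.
far = np.allclose(expected + 0.1, expected)
```

So if this assertion fires, the output is off by more than rounding noise, which usually points to a missing step in the forward pass rather than precision issues.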

Scale the embeddings by multiplying them by the square root of their dimension (sqrt(d_model)):

{mentor edit: code removed}
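For readers who can't see the removed code: a rough sketch of the scaling step described above, as used in the original Transformer paper (variable names here are illustrative, not the assignment's):

```python
import numpy as np

d_model = 4   # embedding dimension (illustrative)
seq_len = 3   # sequence length (illustrative)

# Stand-in for the output of the embedding layer
embeddings = np.random.randn(seq_len, d_model)

# Scale by sqrt(d_model) BEFORE adding the positional encoding;
# skipping this step is a common cause of the "Wrong values" assertion.
scaled = embeddings * np.sqrt(d_model)
```

In the TensorFlow version of the assignment the same idea is written with `tf.math.sqrt(tf.cast(d_model, tf.float32))`, but the numerical effect is identical.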

{mentor edit: hacks removed - no longer necessary}

Thanks for the tip. Doing it like this passes the subsequent tests, and the autograder also passes on submission. It's a shame this last lab has these inconsistencies, as noted in another thread here: Pedagogy of C5-W4-A1 (-Ex4: Encoder Layer). It makes the lab very confusing to finish properly. Thankfully we have helpful classmates here on Discourse to lend a hand! :grin:

{mentor edit: hacks removed - no longer necessary}

{mentor edit: hacks removed - no longer necessary}

Thanks, it worked!

Thanks! You really saved me after a whole day of trying to debug my code.

{mentor edit: hacks removed - no longer necessary}

{mentor edit: hacks removed - no longer necessary}
