How to solve this?
Same question here. I was following the TF transformer guideline.
{mentor edit: hacks removed - no longer necessary}
{mentor edit: hacks removed - no longer necessary}
I have the same issue. In my opinion, the problem lies elsewhere; I got the same output as you.
My DecoderLayer part works fine, but the Decoder fails. Here is the code for my Decoder's call function:
{mentor edit: code removed}
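For anyone hitting this, the usual failure point inside the Decoder is the masked self-attention step. Below is a minimal NumPy sketch of scaled dot-product attention with a look-ahead mask, just to illustrate the mechanism being tested. This is not the course's implementation (the mask convention and function names here are assumptions, and the actual assignment uses TensorFlow layers):

```python
import numpy as np

def look_ahead_mask(size):
    # Upper-triangular mask: position i may not attend to positions > i.
    # Convention here (an assumption): 1.0 marks a position to mask out;
    # the course code may use the opposite convention.
    return np.triu(np.ones((size, size)), k=1)

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: arrays of shape (..., seq_len, depth)
    dk = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(dk)
    if mask is not None:
        scores = scores + (mask * -1e9)  # push masked logits toward -inf
    # Numerically stable softmax over the last axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

With the look-ahead mask applied, the attention weights above the diagonal come out near zero, so each position only attends to itself and earlier positions. If your Decoder passes the wrong mask (or applies it with the wrong sign convention) to the self-attention block, the layer outputs will be close to correct in shape but wrong in value, which matches the `Wrong values in outd` assertion failure.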
Mine is the same as yours, and the Decoder call still fails. Here is my DecoderLayer; I commented out two assertions and it passed the tests:
{mentor edit: code removed}
{mentor edit: hacks removed - no longer necessary}
assert np.allclose(outd[1, 1], [-0.34560338, -0.8762897, -0.4767484, 1.6986415]), "Wrong values in outd"
Still the same error.
{mentor edit: hacks removed - no longer necessary}
When I do that, Decoder_test fails as follows:
assert np.allclose(outd[1, 1], [-0.34560338, -0.8762897, -0.4767484, 1.6986415]), "Wrong values in outd"
See the Decoder code I shared earlier:
{mentor edit: code removed}
{mentor edit: hacks removed - no longer necessary}
Thanks for the tip. Doing it this way passes the subsequent tests, and the autograder also passes on submission. It's a shame this last lab has these inconsistencies, as noted in another thread here: Pedagogy of C5-W4-A1 (-Ex4: Encoder Layer). That makes it very confusing to finish the lab properly. Thankfully we have our helpful classmates here on Discourse to lend a hand!
{mentor edit: hacks removed - no longer necessary}
{mentor edit: hacks removed - no longer necessary}
Thanks, man, it worked!
Thanks! You really saved me, after a whole day of trying to debug my code…
{mentor edit: hacks removed - no longer necessary}
{mentor edit: hacks removed - no longer necessary}