Completely stuck at DecoderLayer Exercise 5, week 4

Dear all,

I am completely stuck on the call to self.mha2 in this exercise. I receive the following error:

assert np.allclose(attn_w_b2[0, 0, 1], [0.33365652, 0.32598493, 0.34035856]), "Wrong values in attn_w_b2. Check the call to self.mha2"

I have tried every combination of the input parameters and out1 that I can think of.

I have also read all the existing threads on this topic. Any help would be appreciated.

{edited, sample code removed}


If you use that and still get an error, then either your out1 or enc_output are wrong.
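For anyone hitting the same wall: the key point in this step is the wiring of the second attention block. The queries come from the decoder side (out1), while the keys and values both come from enc_output. The assignment itself uses a Keras MultiHeadAttention layer, which isn't shown here, so this is only a minimal single-head numpy sketch of that Q/K/V pattern (the function name and shapes are illustrative, not from the assignment):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(out1, enc_output):
    """Single-head scaled dot-product cross-attention:
    queries come from the decoder (out1); keys and values
    both come from the encoder output."""
    d_k = enc_output.shape[-1]
    # (batch, target_len, d) @ (batch, d, input_len) -> (batch, target_len, input_len)
    scores = out1 @ enc_output.swapaxes(-1, -2) / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # analogue of attn_w_b2
    return weights @ enc_output, weights

rng = np.random.default_rng(0)
out1 = rng.standard_normal((1, 2, 4))        # (batch, target_len, d_model)
enc_output = rng.standard_normal((1, 3, 4))  # (batch, input_len, d_model)
attn2, attn_w_b2 = cross_attention(out1, enc_output)
print(attn_w_b2.shape)  # one weight per encoder position: (1, 2, 3)
```

The attention-weight tensor has one column per encoder position, which is why the failing assert checks three values: it is comparing a row of those weights.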

OK, I got past that step by switching out1, but now I get errors in both tests, with and without the padding mask. The values in out[0, 0] do not match those in the test template.

I'm an idiot: found it. I was not applying the three layer normalizations correctly.
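For future readers, the three normalizations follow the same residual pattern in each decoder sub-block: normalize the sub-layer output added to the sub-layer input. Below is a minimal numpy sketch of that pattern; the sub-layer outputs are stand-ins (scaled copies), not real attention or feed-forward computations:

```python
import numpy as np

def layernorm(x, eps=1e-6):
    """Plain layer normalization over the last axis (no learned scale/shift)."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

x = np.random.default_rng(1).standard_normal((1, 2, 4))

attn1 = 0.5 * x               # stand-in for the self-attention output
out1 = layernorm(attn1 + x)   # layernorm 1: residual around mha1

attn2 = 0.5 * out1            # stand-in for the cross-attention output
out2 = layernorm(attn2 + out1)  # layernorm 2: residual around mha2

ffn_out = 0.5 * out2          # stand-in for the feed-forward output
out3 = layernorm(ffn_out + out2)  # layernorm 3: residual around the ffn
```

A common bug is normalizing the wrong sum, e.g. passing attn2 + out2 or reusing x instead of the previous block's output; each norm must see the input of its own sub-block.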

I succeeded in the end. Thanks for your help, Tom.