Week 4 A1 UNQ_C6: Wrong values in attn_w_b2

The first three lines of my code are below. attn_weights_block1 passes the test, but attn_weights_block2 does not. There are only three tensor inputs to self.mha2(), and I cannot figure out which ones are incorrect. Please help.

mult_attn_out1, attn_weights_block1 = self.mha1(x, x, x, look_ahead_mask, return_attention_scores=True)
Q1 = self.layernorm1(mult_attn_out1 + x)
mult_attn_out2, attn_weights_block2 = self.mha2(enc_output, enc_output, Q1, padding_mask, return_attention_scores=True)


AssertionError Traceback (most recent call last)
in
1 # UNIT TEST
----> 2 DecoderLayer_test(DecoderLayer, create_look_ahead_mask)

~/work/W4A1/public_tests.py in DecoderLayer_test(target, create_look_ahead_mask)
179
180 assert np.allclose(attn_w_b1[0, 0, 1], [0.5271505, 0.47284946, 0.], atol=1e-2), "Wrong values in attn_w_b1. Check the call to self.mha1"
--> 181 assert np.allclose(attn_w_b2[0, 0, 1], [0.32048798, 0.390301, 0.28921106]), "Wrong values in attn_w_b2. Check the call to self.mha2"
182 assert np.allclose(out[0, 0], [-0.22109576, -1.5455486, 0.852692, 0.9139523]), "Wrong values in out"
183

AssertionError: Wrong values in attn_w_b2. Check the call to self.mha2

The order of the first three operands to self.mha2() is incorrect. Keras MultiHeadAttention is called as mha(query, value, key, attention_mask, ...), and in the decoder's second attention block the query must come from the decoder side (Q1), while the keys and values come from enc_output. Your mha1 call passed only because all three operands there are the same tensor x, so the order cannot matter.
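
To see why the operand order matters, here is a minimal single-head scaled dot-product attention sketch in NumPy (the helper name and toy shapes are illustrative, not from the assignment). With the decoder states as the query and the encoder output as key/value, you get one row of attention weights per decoder position; swapping the operands changes the result entirely (in this toy example even the shape, since the two sequence lengths differ).

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    # q: (seq_q, d); k, v: (seq_kv, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    # numerically stable softmax over the key axis
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ v, w

rng = np.random.default_rng(0)
Q1 = rng.normal(size=(3, 4))   # decoder states: 3 target positions
enc = rng.normal(size=(5, 4))  # encoder output: 5 source positions

# Correct: query from the decoder, key/value from the encoder.
out_ok, w_ok = scaled_dot_product_attention(Q1, enc, enc)
# Swapped operands attend in the wrong direction.
out_bad, w_bad = scaled_dot_product_attention(enc, Q1, Q1)

print(w_ok.shape)   # (3, 5): one weight row per decoder position
print(w_bad.shape)  # (5, 3): not what the decoder block should produce
```

In the graded test the target and source lengths happen to match, so a swapped call still produces a tensor of the right shape and only the values are wrong, which is exactly what the "Wrong values in attn_w_b2" assertion catches.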