C5W4E5 - Course 5, Week 4, Exercise 5 - Encoder

I am stuck on exercise 5 (implementing the call method for the Encoder class).
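For reference, here is a minimal sketch of what I understand the call method is supposed to do. The attribute names (self.embedding, self.pos_encoding, self.dropout, self.enc_layers, self.embedding_dim) are from my reading of the notebook template, so treat them as assumptions rather than the official solution:

import tensorflow as tf

# Body of Encoder.call (a tf.keras.layers.Layer subclass from the assignment template)
def call(self, x, training, mask):
    # x: (batch_size, input_seq_len) of token ids
    seq_len = tf.shape(x)[1]

    # Token embedding, scaled by sqrt(d_model) as in "Attention Is All You Need"
    x = self.embedding(x)  # (batch_size, input_seq_len, embedding_dim)
    x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32))

    # Add the pre-computed positional encoding for the first seq_len positions
    x += self.pos_encoding[:, :seq_len, :]

    # Dropout on the summed embeddings (only active when training=True)
    x = self.dropout(x, training=training)

    # Pass the result through the stack of EncoderLayers
    for i in range(self.num_layers):
        x = self.enc_layers[i](x, training, mask)

    return x  # (batch_size, input_seq_len, embedding_dim)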

Below is the output I am getting (from a print statement):

tf.Tensor(
[[[-0.6906099   1.098871   -1.2605863   0.8523253 ]
  [ 0.7319227  -0.3826024  -1.4507655   1.1014452 ]
  [ 1.0995712  -1.1686687  -0.8088865   0.87798405]]

 [[-0.7194743   1.1862508  -1.2249074   0.7581308 ]
  [ 0.58073866 -0.4223178  -1.3967535   1.2383327 ]
  [ 1.153197   -1.1432416  -0.83053297  0.82057744]]], shape=(2, 3, 4), dtype=float32)

From what I can gather after downloading public_tests.py, the test expects the following values:

[[[-0.6906098 ,  1.0988709 , -1.260586  ,  0.85232526],
[ 0.7319228 , -0.3826024 , -1.4507656 ,  1.1014453 ],
[ 1.0995713 , -1.1686686 , -0.80888665,  0.8779839 ]],
[[-0.4612937 ,  1.0697356 , -1.4127715 ,  0.8043293 ],
[ 0.27027237,  0.28793618, -1.6370889 ,  1.0788803 ],
[ 1.2370994 , -1.0687275 , -0.8945037 ,  0.7261319 ]]])

…and below is the error message I get:

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-27-68aa9b8f5815> in <module>
      1 # UNIT TEST
----> 2 Encoder_test(Encoder)

~/work/W4A1/public_tests.py in Encoder_test(target)
    124                         [[-0.4612937 ,  1.0697356 , -1.4127715 ,  0.8043293 ],
    125                          [ 0.27027237,  0.28793618, -1.6370889 ,  1.0788803 ],
--> 126                          [ 1.2370994 , -1.0687275 , -0.8945037 ,  0.7261319 ]]]), "Wrong values case 1"
    127 
    128     encoderq_output = encoderq(x, True, np.array([[[[1., 1., 1.]]], [[[1., 1., 0.]]]]))

AssertionError: Wrong values case 1

Something I think is also worth noting: the test cases for every other exercise (before and after this one) pass, including Exercise 8 (Transformer), which actually uses this Encoder class and works just fine (again, passing all of its test cases).

Any help/insight on this would be much appreciated.

It could be an issue in Encoder(), or more likely in EncoderLayer(). (Note that only the second example in the batch differs from the expected values; the first one matches.)

Update: the error was in EncoderLayer(), in the call to self.mha(…).
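For anyone running into the same thing, here is a rough sketch of how EncoderLayer.call is typically wired up in this assignment, including the self-attention call that tripped me up. Again, the attribute names (self.mha, self.ffn, self.layernorm1, self.layernorm2, self.dropout_ffn) come from my reading of the notebook template and are assumptions, not the official solution:

# Body of EncoderLayer.call, where self.mha is a tf.keras.layers.MultiHeadAttention
def call(self, x, training, mask):
    # Self-attention: query, value and key are all x, plus the padding mask.
    # Passing the wrong arguments here (e.g. dropping the mask) does not raise
    # an error, it just changes the numbers, which is exactly the
    # "Wrong values" assertion above.
    attn_output = self.mha(x, x, x, mask)  # (batch_size, input_seq_len, embedding_dim)

    # Residual connection + layer norm around the attention block
    out1 = self.layernorm1(x + attn_output)

    # Position-wise feed-forward network, with dropout during training
    ffn_output = self.ffn(out1)
    ffn_output = self.dropout_ffn(ffn_output, training=training)

    # Second residual connection + layer norm
    out2 = self.layernorm2(out1 + ffn_output)

    return out2  # (batch_size, input_seq_len, embedding_dim)

Since an argument mistake in that self.mha call only changes the output values rather than raising an exception, the "Wrong values case 1" assertion was the only symptom.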