Week5 exercise5 - encoder call()

I have an error that says:

 ---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-58-2f05cf553abf> in <module>
     28     print("\033[92mAll tests passed")
     29 
---> 30 Encoder_test(Encoder)

<ipython-input-58-2f05cf553abf> in Encoder_test(target)
     24                         [[-0.3489219,   0.31335592, -1.3568854,   1.3924513 ],
     25                          [-0.08761203, -0.1680029,  -1.2742313,   1.5298463 ],
---> 26                          [ 0.2627198,  -1.6140151,   0.2212624 ,  1.130033  ]]]), "Wrong values"
     27 
     28     print("\033[92mAll tests passed")

AssertionError: Wrong values

My code is:

        seq_len = tf.shape(x)[1]
        
        # START CODE HERE
        x = self.embedding(x) 
        x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32)) 
        x += self.pos_encoding[:, :seq_len, :]
        x = self.dropout(x, training=training)
 
        for i in range(self.num_layers):
            self.enc_layers[i](x, training, mask)

I don’t know what I did wrong in my code ;( Can someone help me?

Maybe try tf.sqrt instead of tf.math.sqrt, though I don’t think it should make a difference. You could also try saving your work, restarting the kernel, and running everything over again. Another option is to add a print statement just before the assertion that’s failing and compare the values you get with the expected ones; sometimes that helps.
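The print-and-compare idea can be sketched like this (a minimal sketch; the array values here are illustrative stand-ins, not the actual tensors from the test):

```python
import numpy as np

# Illustrative stand-ins for the tensors in the failing assertion.
actual = np.array([0.26, -1.61, 0.22, 1.13])                    # what your Encoder returned
expected = np.array([0.2627198, -1.6140151, 0.2212624, 1.130033])  # from the test

# Print both side by side before the assertion fires, so you can see
# whether the values are merely close (a tolerance issue) or wildly
# different (a logic issue, e.g. a layer that was never applied).
print("actual:  ", actual)
print("expected:", expected)
print("max abs diff:", np.max(np.abs(actual - expected)))
print("allclose (atol=1e-2):", np.allclose(actual, expected, atol=1e-2))
```

If the numbers are close but not identical, suspect a dtype or ordering detail; if they are completely different, a whole step (embedding scale, positional encoding, a layer call) is probably missing.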

I tried everything but nothing works :frowning:

Have you tried running the auto-grader to see if it catches some not-quite-right code in a previous exercise?

The code you posted looks fine; maybe the error is in a previous step.

@Anomy, @ccc888 I checked my previous exercises and everything passes…
This is my previous code for the encoder layer, which also passes its test…
Do you think there’s anything wrong with this code?

attn_output = self.mha(x, x, x, mask)                        # self-attention: query = key = value = x
attn_output = self.dropout1(attn_output, training=training)
out1 = self.layernorm1(x + attn_output)                      # residual connection + layer norm
ffn_output = self.ffn(out1)                                  # position-wise feed-forward network
ffn_output = self.dropout2(ffn_output, training=training)
out2 = self.layernorm2(out1 + ffn_output)                    # second residual connection + layer norm
return out2

Sorry, my subscription ended yesterday so I don’t have access to the assignment anymore to look at the context. Have you tried labeling the parameters for mha?

Hey, try this:

for i in range(self.num_layers):
    x = self.enc_layers[i](x, training, mask)
# END CODE HERE
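For anyone wondering why the assignment matters: a Keras layer call returns a new tensor rather than modifying its input in place, so calling self.enc_layers[i](x, ...) without capturing the result silently throws the output away. A toy illustration with a plain function (no TensorFlow needed):

```python
def layer(x):
    # Stand-in for a Keras layer call: returns a new value,
    # never modifies its argument in place.
    return x + 1

x = 0
for _ in range(3):
    layer(x)         # bug: return value discarded, x stays 0
print(x)             # -> 0

x = 0
for _ in range(3):
    x = layer(x)     # fix: feed each layer's output into the next
print(x)             # -> 3
```

This is exactly the difference between the original loop and the corrected one above: without the assignment, the encoder returns the embeddings untouched by any encoder layer, which is why the test values were wrong rather than just slightly off.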


@John666 Thanks!! Can’t believe the problem was as simple as this…