Sequence Models Week 4 Encoder Error

I am using the code below for the encoder, and I don't see anything wrong with it. However, I'm getting an AssertionError. Can you please check the code?

###ERROR

AssertionError                            Traceback (most recent call last)
in
      1 # UNIT TEST
----> 2 Encoder_test(Encoder)

~/work/W4A1/public_tests.py in Encoder_test(target)
    124     [[-0.4612937 ,  1.0697356 , -1.4127715 ,  0.8043293 ],
    125      [ 0.27027237,  0.28793618, -1.6370889 ,  1.0788803 ],
--> 126      [ 1.2370994 , -1.0687275 , -0.8945037 ,  0.7261319 ]]]), "Wrong values case 1"
    127
    128 encoderq_output = encoderq(x, True, np.array([[[[1., 1., 1.]]], [[[1., 1., 0.]]]]))

AssertionError: Wrong values case 1

###CODE

# UNQ_C5 (UNIQUE CELL IDENTIFIER, DO NOT EDIT)
# GRADED FUNCTION
class Encoder(tf.keras.layers.Layer):
    """
    The entire Encoder starts by passing the input to an embedding layer
    and using positional encoding to then pass the output through a stack of
    encoder layers.
    """
    def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size,
                 maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6):
        super(Encoder, self).__init__()

        self.embedding_dim = embedding_dim
        self.num_layers = num_layers

        self.embedding = Embedding(input_vocab_size, self.embedding_dim)
        self.pos_encoding = positional_encoding(maximum_position_encoding,
                                                self.embedding_dim)

        self.enc_layers = [EncoderLayer(embedding_dim=self.embedding_dim,
                                        num_heads=num_heads,
                                        fully_connected_dim=fully_connected_dim,
                                        dropout_rate=dropout_rate,
                                        layernorm_eps=layernorm_eps)
                           for _ in range(self.num_layers)]

        self.dropout = Dropout(dropout_rate)

    def call(self, x, training, mask):
        """
        Forward pass for the Encoder

        Arguments:
            x -- Tensor of shape (batch_size, input_seq_len)
            training -- Boolean, set to true to activate
                        the training mode for dropout layers
            mask -- Boolean mask to ensure that the padding is not
                    treated as part of the input
        Returns:
            out2 -- Tensor of shape (batch_size, input_seq_len, fully_connected_dim)
        """
        #mask = create_padding_mask(x)
        print('x ', x.shape)

        seq_len = x.shape[1]

        # START CODE HERE
        # Pass input through the Embedding layer
        x = self.embedding(x)  # (batch_size, input_seq_len, fully_connected_dim)
        print('x1 ', type(x))

        # Scale embedding by multiplying it by the square root of the embedding dimension
        x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32))
        print('x2 ', type(x))

        # Add the position encoding to embedding
        x += self.pos_encoding[:, :seq_len, :]
        print('x3 ', type(x))

        # Pass the encoded embedding through a dropout layer
        y = self.dropout(x, training=training)
        print('x4 ', type(x), x.shape)

        # Pass the output through the stack of encoding layers
        print('num_layers ', self.num_layers)
        for i in range(self.num_layers):
            x = self.enc_layers[i](x, training, mask)

        print('enc_layers')
        # END CODE HERE

        return x  # (batch_size, input_seq_len, fully_connected_dim)


Hi @forthe_websites3

The problem is that the output of the dropout layer is never passed into the stack of encoder layers. Your code assigns the dropout output to 'y' and then never uses it, so dropout is effectively skipped and the un-dropped 'x' flows into the encoder layers, producing the wrong values. Assign the result back to 'x' instead.
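To see why assigning to a new variable breaks the forward pass, here is a minimal stdlib-only sketch (no TensorFlow; 'dropout' is a hypothetical stand-in that zeroes every other element) mimicking the structure of your call method:

```python
def dropout(t):
    # Toy stand-in for a dropout layer: zero out every odd-indexed element.
    return [v if i % 2 == 0 else 0.0 for i, v in enumerate(t)]

x = [1.0, 2.0, 3.0, 4.0]

# Buggy pattern: the dropout result lands in 'y' and is never used,
# so the downstream layers still see the original, un-dropped 'x'.
y = dropout(x)
buggy_input_to_layers = x   # still [1.0, 2.0, 3.0, 4.0]

# Fixed pattern: assign the dropout output back to 'x',
# so the encoder-layer loop consumes the dropped-out tensor.
x = dropout(x)
fixed_input_to_layers = x   # [1.0, 0.0, 3.0, 0.0]

print(buggy_input_to_layers)  # [1.0, 2.0, 3.0, 4.0]
print(fixed_input_to_layers)  # [1.0, 0.0, 3.0, 0.0]
```

In the assignment itself, the fix is the one-line change of 'y = self.dropout(x, training=training)' to 'x = self.dropout(x, training=training)'; everything after that line can stay as it is.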