GRADED FUNCTION EncoderLayer

Pretty sure my code is good, but I get this error:

NameError                                 Traceback (most recent call last)
&lt;ipython-input&gt; in &lt;module&gt;
      1 # UNIT TEST
----> 2 EncoderLayer_test(EncoderLayer)

~/work/W4A1/public_tests.py in EncoderLayer_test(target)
     82 def EncoderLayer_test(target):
     83     q = np.array([[[1, 0, 1, 1], [0, 1, 1, 1], [1, 0, 0, 1]]]).astype(np.float32)
---> 84     encoder_layer1 = target(4, 2, 8)
     85     tf.random.set_seed(10)
     86     encoded = encoder_layer1(q, True, np.array([[1, 0, 1]]))

&lt;ipython-input&gt; in __init__(self, embedding_dim, num_heads, fully_connected_dim, dropout_rate, layernorm_eps)
     16                                       dropout=dropout_rate)
     17
---> 18         self.ffn = FullyConnected(embedding_dim=embedding_dim,
     19                                   fully_connected_dim=fully_connected_dim)
     20

NameError: name 'FullyConnected' is not defined

Try restarting the kernel and running all the cells again.
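As an aside, this explains why the EncoderLayer cell itself runs without complaint: Python only looks up a name when the code using it actually executes, so a missing helper inside `__init__` surfaces at instantiation time, not at class-definition time. A minimal sketch of that behavior (the names here are illustrative, not the assignment's code):

```python
# Defining the class succeeds even though FullyConnected does not exist:
# the body of __init__ is not executed yet.
class DemoLayer:
    def __init__(self):
        self.ffn = FullyConnected()  # undefined helper (illustrative)

# Only instantiating the class triggers the lookup and raises NameError.
try:
    DemoLayer()
except NameError as e:
    print(e)  # name 'FullyConnected' is not defined
```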

Worked like a charm, thank you

Hello, I have the same problem. I have already tried restarting the kernel. Do you have any idea what is going on?

Restarting the Kernel alone is not enough. You have to run all the cells that come before that particular cell (EncoderLayer cell).
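One quick way to confirm the earlier cells actually ran (an illustrative check, not part of the assignment) is to ask the notebook's global namespace whether the helper names exist before running the graded cell:

```python
# Run this in a fresh notebook cell. If any name is listed as missing,
# the cell that defines it has not been executed in this kernel session.
needed = ["FullyConnected", "EncoderLayer"]
missing = [name for name in needed if name not in globals()]
print("missing:", missing)
```

An empty `missing` list means both definitions are loaded; otherwise, scroll up and run the cell that defines the missing name.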

Thanks for the answer, I already did that.

Please answer these:

  1. Have you deleted any cell(s)?
  2. Have you changed the code of ungraded cell(s) or at any other place which you were not supposed to do?
  3. Have you passed all the above tests?
  1. & 2. I did neither of these, but maybe in a moment of distraction; if I have to answer, I will say no.
  3. Yes, I passed all the above tests.

Here is the EncoderLayer cell without my answers:

UNQ_C4 (UNIQUE CELL IDENTIFIER, DO NOT EDIT)

GRADED FUNCTION EncoderLayer

class EncoderLayer(tf.keras.layers.Layer):
    """
    The encoder layer is composed by a multi-head self-attention mechanism,
    followed by a simple, positionwise fully connected feed-forward network.
    This architecture includes a residual connection around each of the two
    sub-layers, followed by layer normalization.
    """
    def __init__(self, embedding_dim, num_heads, fully_connected_dim,
                 dropout_rate=0.1, layernorm_eps=1e-6):
        super(EncoderLayer, self).__init__()

        self.mha = MultiHeadAttention(num_heads=num_heads,
                                      key_dim=embedding_dim,
                                      dropout=dropout_rate)

        self.ffn = FullyConnected(embedding_dim=embedding_dim,
                                  fully_connected_dim=fully_connected_dim)

        self.layernorm1 = LayerNormalization(epsilon=layernorm_eps)
        self.layernorm2 = LayerNormalization(epsilon=layernorm_eps)

        self.dropout_ffn = Dropout(dropout_rate)

    def call(self, x, training, mask):
        """
        Forward pass for the Encoder Layer

        Arguments:
            x -- Tensor of shape (batch_size, input_seq_len, fully_connected_dim)
            training -- Boolean, set to true to activate
                        the training mode for dropout layers
            mask -- Boolean mask to ensure that the padding is not
                    treated as part of the input
        Returns:
            encoder_layer_out -- Tensor of shape (batch_size, input_seq_len, embedding_dim)
        """

I will try to delete the .ipynb file and start a new one.

Please let us know if getting a fresh copy of your assignment resolves the issue.

I’m back.
I noticed that this time there is a cell which defines the layer. I hadn't seen it before; maybe I deleted it.
Thanks for your help.

def FullyConnected(embedding_dim, fully_connected_dim):
    return tf.keras.Sequential([
        tf.keras.layers.Dense(fully_connected_dim, activation='relu'),  # (batch_size, seq_len, dff)
        tf.keras.layers.Dense(embedding_dim)  # (batch_size, seq_len, d_model)
    ])
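For intuition about what that helper computes, the two Dense layers can be sketched in plain NumPy: a ReLU-activated expansion to the hidden width, then a projection back to the embedding width. The weights and dimensions here (embedding_dim=4, fully_connected_dim=8) are illustrative only, chosen to match the unit test's shapes:

```python
import numpy as np

rng = np.random.default_rng(0)
embedding_dim, fully_connected_dim = 4, 8

# Randomly initialized weights, standing in for the two Dense layers.
W1 = rng.standard_normal((embedding_dim, fully_connected_dim))
b1 = np.zeros(fully_connected_dim)
W2 = rng.standard_normal((fully_connected_dim, embedding_dim))
b2 = np.zeros(embedding_dim)

x = rng.standard_normal((1, 3, embedding_dim))  # (batch_size, seq_len, d_model)
hidden = np.maximum(0.0, x @ W1 + b1)           # ReLU -> (batch_size, seq_len, dff)
out = hidden @ W2 + b2                          # back to (batch_size, seq_len, d_model)
print(out.shape)  # (1, 3, 4)
```

The key point for the residual connection in EncoderLayer is that the output shape matches the input shape, so the two tensors can be added.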

:+1: :+1: :+1: :slightly_smiling_face: :slightly_smiling_face: :slightly_smiling_face:
