Week 4 - Exercise 4 - Encoder Layer

Hi, once again, as far as I know I'm doing things correctly. I've looked at the forum and followed the instructions, but I'm still facing the following error. I can send my code privately if a mentor contacts me. Thank you.


Check this thread.

Hi. For ffn_output, the instructions say to pass the output of the multi-head attention layer through a ffn. The instructions should mention the skip connection here too, as they do for encoder_layer_out.

Can you please explain how you did it? What should I add to my code?

You just need to pass the output of the multi-head attention layer (which already includes the skip connection) through a ffn before applying the dropout.

Your code is incorrect. What are self_attn_output and mult_attn_out? The code structure has already been given to you. You just need to fill in the correct terms, not write any new lines or any new code.

Here is the code structure given to you. You just need to replace each None with the correct term. Don't write anything else, like self_attn_output or mult_attn_out. A sketch of the overall data flow follows the template, for reference.

# START CODE HERE
# calculate self-attention using mha (~1 line)
# Dropout is added by Keras automatically if the dropout parameter is non-zero during training
self_mha_output = None  # Self attention (batch_size, input_seq_len, fully_connected_dim)
  
# skip connection
# apply layer normalization on the sum of the input and the attention output to get the
# output of the multi-head attention layer (~1 line)
skip_x_attention = None  # (batch_size, input_seq_len, fully_connected_dim)

# pass the output of the multi-head attention layer through a ffn (~1 line)
ffn_output = None  # (batch_size, input_seq_len, fully_connected_dim)
  
# apply dropout layer to ffn output during training (~1 line)
# use `training=training` 
ffn_output = None
  
# apply layer normalization on the sum of the output from multi-head attention (skip connection)
# and the ffn output to get the output of the encoder layer (~1 line)
encoder_layer_out = None  # (batch_size, input_seq_len, embedding_dim)
# END CODE HERE
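
For anyone still confused about the data flow (not the graded answer itself): here is a minimal, self-contained sketch of a post-layer-norm encoder layer built from standard Keras layers. The attribute names (self.mha, self.ffn, self.layernorm1, self.layernorm2, self.dropout_ffn) and the constructor arguments are assumptions about how the class is set up, so treat this as an illustration of the order of operations, not as the assignment's exact code.

import tensorflow as tf

class EncoderLayer(tf.keras.layers.Layer):
    # Illustrative sketch only; attribute names are assumed, not the assignment's.
    def __init__(self, embedding_dim, num_heads, fully_connected_dim, dropout_rate=0.1):
        super().__init__()
        self.mha = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=embedding_dim, dropout=dropout_rate)
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(fully_connected_dim, activation="relu"),
            tf.keras.layers.Dense(embedding_dim),
        ])
        self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
        self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)

    def call(self, x, training, mask):
        # self-attention: query, value, and key are all x
        self_mha_output = self.mha(x, x, x, attention_mask=mask)
        # first skip connection: layer-normalize the sum of the input and the attention output
        skip_x_attention = self.layernorm1(x + self_mha_output)
        # pass the (already skip-connected) attention output through the ffn
        ffn_output = self.ffn(skip_x_attention)
        # dropout is only active during training
        ffn_output = self.dropout_ffn(ffn_output, training=training)
        # second skip connection, around the ffn sub-layer
        encoder_layer_out = self.layernorm2(skip_x_attention + ffn_output)
        return encoder_layer_out

A quick shape check, again just as a sanity sketch:

layer = EncoderLayer(embedding_dim=64, num_heads=4, fully_connected_dim=128)
x = tf.random.uniform((2, 10, 64))  # (batch_size, input_seq_len, embedding_dim)
print(layer(x, training=False, mask=None).shape)  # (2, 10, 64)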