Problem with UNQC4

couldn’t understand half of the syntax. I followed the guidelines and the function doesn’t work. please help

### START CODE HERE ###

# Step 0: call the helper function to create layers for the input encoder
input_encoder = input_encoder_fn(input_vocab_size, d_model, n_encoder_layers)

# Step 0: call the helper function to create layers for the pre-attention decoder
pre_attention_decoder = pre_attention_decoder_fn(mode, target_vocab_size, d_model)

# Step 1: create a serial network
model = tl.Serial( 
    
  # Step 2: copy input tokens and target tokens as they will be needed later.
    tl.Select([input_tokens, target_tokens, input_tokens, target_tokens]),
    
  # Step 3: run input encoder on the input and pre-attention decoder the target.
  input_encoder(input_tokens, pre_attention_decoder),
    
  # Step 4: prepare queries, keys, values and mask for attention.
  prepare_attention_input('PrepareAttentionInput', pre_attention_decoder, n_out=4),
    
  # Step 5: run the AttentionQKV layer
  # nest it inside a Residual layer to add to the pre-attention decoder activations(i.e. queries)
  tl.Residual(tl.AttentionQKV(d_model, n_heads=n_attention_heads, dropout=attention_dropout, mode=None)),
  
  # Step 6: drop attention mask (i.e. index = None
   tl.Select([attention_activations, target_tokens]),
    
  # Step 7: run the rest of the RNN decoder
  [ tl.LSTM() for _ in range(n_decoder_layers)],
    
  # Step 8: prepare output by making it the right size
   tl.Dense(target_vocab_size),
    
  # Step 9: Log-softmax for output
  tl.LogSoftmax()
)

### END CODE HERE

return model

Please don’t post you code on the forum. This is not allowed by the course honor code.
Post the error messages.