Course 5 - Transformer question about the layers in the EncoderLayer

Hello,

In the EncoderLayer (from the Transformer assignment), the following Keras layers are used:
MultiHeadAttention, FullyConnected, LayerNormalization (twice), and Dropout → from Keras' point of view there are 5 layers, isn't it?

But the unit test Encoder_test(Encoder) passes 2 as num_layers → what counts as a layer from this implementation's point of view? Just MultiHeadAttention and FullyConnected?

Could you explain the “for” loop below:
self.enc_layers = [EncoderLayer(embedding_dim=self.embedding_dim,
                                num_heads=num_heads,
                                fully_connected_dim=fully_connected_dim,
                                dropout_rate=dropout_rate,
                                layernorm_eps=layernorm_eps)
                   for _ in range(self.num_layers)]

Thanks in advance,

num_layers refers to the number of EncoderLayer instances, not to the Keras sub-layers inside each one: a single EncoderLayer is one complete encoder block (multi-head attention plus the feed-forward network, together with their normalization and dropout wiring). The for loop you’ve highlighted is a list comprehension that builds num_layers independent EncoderLayer objects and stacks them in self.enc_layers. In the Encoder's call method, you should then invoke these enc_layers sequentially, feeding the output of each block into the next.
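
For concreteness, here is a minimal, self-contained sketch of how the two pieces fit together. This is not the assignment's exact code: the FullyConnected helper is replaced by an inline two-Dense stack, the training/mask arguments are illustrative assumptions, and the Encoder's embedding and positional-encoding steps are omitted for brevity.

import tensorflow as tf

class EncoderLayer(tf.keras.layers.Layer):
    # One encoder block: self-attention and a feed-forward network,
    # each followed by a residual connection and LayerNormalization.
    def __init__(self, embedding_dim, num_heads, fully_connected_dim,
                 dropout_rate=0.1, layernorm_eps=1e-6):
        super().__init__()
        self.mha = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=embedding_dim, dropout=dropout_rate)
        # Stand-in for the assignment's FullyConnected helper:
        # a two-layer point-wise feed-forward network.
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(fully_connected_dim, activation='relu'),
            tf.keras.layers.Dense(embedding_dim)])
        self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)
        self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)
        self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)

    def call(self, x, training=False, mask=None):
        attn_output = self.mha(x, x, x, attention_mask=mask,
                               training=training)            # self-attention
        out1 = self.layernorm1(x + attn_output)               # residual + norm
        ffn_output = self.dropout_ffn(self.ffn(out1), training=training)
        return self.layernorm2(out1 + ffn_output)             # residual + norm

class Encoder(tf.keras.layers.Layer):
    # Stack of num_layers EncoderLayer blocks (embedding and positional
    # encoding omitted here; the assignment applies them first).
    def __init__(self, num_layers, embedding_dim, num_heads,
                 fully_connected_dim, dropout_rate=0.1, layernorm_eps=1e-6):
        super().__init__()
        self.enc_layers = [EncoderLayer(embedding_dim=embedding_dim,
                                        num_heads=num_heads,
                                        fully_connected_dim=fully_connected_dim,
                                        dropout_rate=dropout_rate,
                                        layernorm_eps=layernorm_eps)
                           for _ in range(num_layers)]

    def call(self, x, training=False, mask=None):
        # Invoke the stacked blocks sequentially: the output of block i
        # becomes the input of block i + 1.
        for enc_layer in self.enc_layers:
            x = enc_layer(x, training=training, mask=mask)
        return x

# Quick shape check with dummy data (shapes are illustrative):
enc = Encoder(num_layers=2, embedding_dim=4, num_heads=2, fully_connected_dim=8)
x = tf.random.uniform((1, 3, 4))   # (batch, seq_len, embedding_dim)
print(enc(x).shape)                # (1, 3, 4)

Because each pass through the comprehension creates a separate instance, the blocks do not share weights, and Keras tracks all of them automatically once the list is assigned to an attribute of the layer.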