What is the architecture for the LSTM model in C3_W3_Lab_2_multiple_layer_LSTM.ipynb?

What is the architecture for the LSTM model in C3_W3_Lab_2_multiple_layer_LSTM.ipynb?

In the Build and Compile the Model part, lstm_dim = 8. Does that mean Ty = 8?
And timesteps = 20. Does that mean Tx = 20?

I read the RNN material from Andrew Ng, but I still don't understand the architecture of the model you used here.

Thanks
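
Below is the code from the lab that I am asking about. To run it myself I first set up the input like this (the batch size of 1 and the 16 features are my own assumptions, not values from the lab):

```python
import tensorflow as tf

lstm_dim = 8     # from the lab
timesteps = 20   # from the lab
# A Keras LSTM expects input shaped (batch_size, timesteps, features)
random_input = tf.random.normal((1, timesteps, 16))
```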

Define LSTM that returns a single output

```python
lstm = tf.keras.layers.LSTM(lstm_dim)
result = lstm(random_input)
print(f"shape of lstm output (return_sequences=False): {result.shape}")
```

Define LSTM that returns a sequence

```python
lstm_rs = tf.keras.layers.LSTM(lstm_dim, return_sequences=True)
result = lstm_rs(random_input)
print(f"shape of lstm output (return_sequences=True): {result.shape}")
```
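
With my assumed input shape, these print (1, 8) and (1, 20, 8) respectively, so return_sequences=True keeps one output per timestep, each of size lstm_dim.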

Those two LSTM layers have the same number of output units: layer 1 has units a[1] through a[8], and layer 2 also has units a[1] through a[8].
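
As a sanity check on that reading, inspecting the weights of the two layers defined above shows the same unit count in both (the shapes below assume my 16-feature input):

```python
# Keras stores an LSTM's weights as three tensors:
#   kernel:           (features, 4 * units)
#   recurrent_kernel: (units, 4 * units)
#   bias:             (4 * units,)
for layer in (lstm, lstm_rs):
    kernel, recurrent_kernel, bias = layer.weights
    print(layer.name, kernel.shape, recurrent_kernel.shape, bias.shape)
# With lstm_dim = 8 and 16 features, both lines print (16, 32) (8, 32) (32,)
```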

But in the following code, the layer 1 and layer 2 LSTMs have different numbers of output units: layer 1 has 64 LSTM units, while layer 2 has 32. What does the architecture look like? Do you drop half of the layer 1 units when connecting them to the layer 2 LSTM units? (See the shape check after the model code below.)

Build the model

```python
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(tokenizer.vocab_size, embedding_dim),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(lstm1_dim, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(lstm2_dim)),
    tf.keras.layers.Dense(dense_dim, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
```
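
To make my last question concrete, here is a standalone sketch of just the embedding and the two recurrent layers (the vocab size of 1000, embedding dimension of 64, and random token IDs are placeholder assumptions on my part; lstm1_dim = 64 and lstm2_dim = 32 match the lab):

```python
import tensorflow as tf

lstm1_dim, lstm2_dim = 64, 32
tokens = tf.random.uniform((1, 20), maxval=1000, dtype=tf.int32)  # assumed token IDs

h = tf.keras.layers.Embedding(1000, 64)(tokens)  # (1, 20, 64)
h = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(lstm1_dim, return_sequences=True))(h)
print(h.shape)  # (1, 20, 128): 64 forward + 64 backward states per timestep
h = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(lstm2_dim))(h)
print(h.shape)  # (1, 64): 32 forward + 32 backward states from the last timestep
```

My current guess is that nothing is dropped: the second LSTM's input kernel has shape (128, 4 * 32), so all 128 features coming out of layer 1 feed every layer 2 unit. Is that right?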