Thank you for your quick response! (This is something I love about DeepLearning.AI courses.)
Yes, I looked at the models in the three ungraded labs. They have the structures listed below (copied and pasted for easy reference). As you can see, none of them includes the combination suggested in the assignment. When I add Conv1D and GlobalMaxPooling1D between the pre-listed Embedding layer and a Bidirectional(LSTM) layer, I get this error:

Input 0 of layer "bidirectional_2" is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 128)
Layers suggested in the assignment:
- Conv1D
- Dropout
- GlobalMaxPooling1D
- MaxPooling1D
- LSTM
- Bidirectional(LSTM)
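If I understand the error correctly, the shape mismatch can be reproduced with plain NumPy (the dimensions below, batch=32, seq_len=120, filters=128, kernel_size=5, are illustrative, not the assignment's values): GlobalMaxPooling1D collapses the whole time axis, so what reaches the Bidirectional(LSTM) is a 2-D (batch, features) tensor, while an LSTM expects a 3-D (batch, time, features) input. MaxPooling1D, by contrast, only pools over local windows and keeps the time axis:

```python
import numpy as np

# Illustrative shapes only (not the assignment's values):
# batch=32, seq_len=120, filters=128, kernel_size=5 with 'valid' padding
conv_out = np.zeros((32, 120 - 5 + 1, 128))  # Conv1D output: (batch, time, filters)

# GlobalMaxPooling1D: max over the entire time axis -> the time axis disappears
global_pool = conv_out.max(axis=1)
print(global_pool.shape)  # (32, 128) -> ndim=2, but LSTM expects ndim=3

# MaxPooling1D(pool_size=4): max over local windows -> the time axis survives
pool_size = 4
t = conv_out.shape[1] // pool_size * pool_size  # drop the ragged tail
local_pool = conv_out[:, :t].reshape(32, -1, pool_size, 128).max(axis=2)
print(local_pool.shape)  # (32, 29, 128) -> still ndim=3, an LSTM accepts this
```

So presumably the fix is either to use MaxPooling1D between Conv1D and the Bidirectional(LSTM), or to place GlobalMaxPooling1D after the recurrent layers rather than before them.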
First ungraded lab
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(tokenizer.vocab_size, embedding_dim),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(lstm_dim)),
    tf.keras.layers.Dense(dense_dim, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
Second ungraded lab
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(tokenizer.vocab_size, embedding_dim),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(lstm1_dim, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(lstm2_dim)),
    tf.keras.layers.Dense(dense_dim, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
Third ungraded lab
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(tokenizer.vocab_size, embedding_dim),
    tf.keras.layers.Conv1D(filters=filters, kernel_size=kernel_size, activation='relu'),
    tf.keras.layers.GlobalMaxPooling1D(),
    tf.keras.layers.Dense(dense_dim, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])