In the routine Emojify_V2, what should be the number of units in the first LSTM?
Would that be input_shape[0]?
embeddings = embedding_layer(sentence_indices)
X = LSTM(units=input_shape[0], return_sequences=True)(embeddings)
My test fails with the following error:
Test failed
Expected value
['LSTM', (None, 4, 128), 67072, (None, 4, 2), 'tanh', True]
does not match the input value:
['LSTM', (None, 4, 4), 112, (None, 4, 2), 'tanh', True]
I feel it's because of the number of units in the first LSTM. The instructions seem to say 1 unit for the second LSTM further down.
The LSTM is not the first layer of this model.
As you can see, the input is handled by the Input layer at the top; that layer takes care of input_shape.
The LSTM layer gets the output from the embedding layer, not the input directly, so it does not need to know the input shape.
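For concreteness, here is a minimal sketch of that wiring (the tensorflow.keras import path is an assumption; sentence_indices and embedding_layer are the names from your own snippet):

from tensorflow.keras.layers import Input

# input_shape is the sequence shape, e.g. (max_len,); the Input
# layer is the only place input_shape appears in the model.
sentence_indices = Input(shape=input_shape, dtype='int32')

# Keras already knows the shape of this tensor, so the LSTM that
# consumes it needs no input_shape of its own.
embeddings = embedding_layer(sentence_indices)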
Regarding the LSTM unit number, there are in-line comments in your assignment notebook.
For the first LSTM, you will find:
# Propagate the embeddings through an LSTM layer with 128-dimensional hidden state
# The returned output should be a batch of sequences.
For the second LSTM, you will also find:
# Propagate X through another LSTM layer with 128-dimensional hidden state
# The returned output should be a single hidden state, not a batch of sequences.
In both cases, the first sentence defines the unit number (128 in both), and the second sentence sets the return_sequences parameter.
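Putting those two comments together, a minimal sketch of the two LSTM calls (in the same functional-API style as your snippet; this is not the full model, just the two layers in question):

# 128-dimensional hidden state, returning the full batch of sequences
X = LSTM(units=128, return_sequences=True)(embeddings)
# 128-dimensional hidden state, returning only the last hidden state
X = LSTM(units=128, return_sequences=False)(X)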
Hope this clarifies.
Thanks for the clarification. It works.