I have the following code:
from tensorflow.keras.preprocessing.text import Tokenizer

def fit_tokenizer(train_sentences, num_words, oov_token):
    tokenizer = Tokenizer(num_words=num_words, oov_token=oov_token)
    tokenizer.fit_on_texts(train_sentences)
    return tokenizer
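For context, this is roughly how I use it to prepare the model inputs (the padding options and variable names such as maxlen are placeholders for my actual values):

from tensorflow.keras.preprocessing.sequence import pad_sequences

tokenizer = fit_tokenizer(train_sentences, num_words, oov_token)
train_sequences = tokenizer.texts_to_sequences(train_sentences)
train_padded = pad_sequences(train_sequences, maxlen=maxlen,
                             padding='post', truncating='post')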
When I set units=5 in my last Dense layer, as follows:
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(num_words, embedding_dim, input_length=maxlen),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(units=5, activation='softmax')
])
Then I obtain the following error:
However, when I change it to units=1 in the Dense layer, training starts, which I suspect is not right.
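In case it matters, here is a minimal sketch of the kind of compile/train step I mean; the loss, label format, and variable names are illustrative rather than my exact code:

# Illustrative sketch only: assumes integer class labels in train_labels
model.compile(loss='sparse_categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
model.fit(train_padded, train_labels, epochs=10,
          validation_data=(val_padded, val_labels))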
What am I doing wrong?
Cheers,