I studied Week 2 of the NLP course and I am confused about word embeddings, word2vec, and LSTM
models. What I understand is that word embeddings make words more relatable to each other, so do we need to train word embeddings before building LSTM models?
Hello @Moustafa_hussein,
I think the basic answer is "No, we do not have to," because you can train word embeddings together with an LSTM model; the LSTM does not have to be built on top of pre-trained word embeddings. If we look at this result from the word2vec paper:
This is just one of the many result tables. RNNLM stands for Recurrent Neural Network Language Model, and it was used to build word embeddings. The paper didn't test an LSTM, but since an LSTM is a type of RNN, we can also use an LSTM to train a set of word embeddings.
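To make this concrete, here is a minimal Keras sketch of both options (this assumes TensorFlow, which may not be the stack your course uses; the vocabulary size, embedding dimension, layer sizes, and the stand-in "pretrained" matrix are all made-up placeholders, not values from the course or the paper):

```python
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 10_000  # hypothetical vocabulary size
EMBED_DIM = 100      # hypothetical embedding dimension

# Option 1: learn the embeddings jointly with the LSTM.
# The Embedding layer starts from random weights and is updated by
# backpropagation along with the LSTM, so no pre-training is needed.
joint_model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Option 2: initialize from pre-trained vectors (e.g., word2vec) and
# freeze them. The random matrix here is only a stand-in for real
# pre-trained weights you would load yourself.
pretrained = np.random.rand(VOCAB_SIZE, EMBED_DIM)
pretrained_model = tf.keras.Sequential([
    tf.keras.layers.Embedding(
        VOCAB_SIZE,
        EMBED_DIM,
        embeddings_initializer=tf.keras.initializers.Constant(pretrained),
        trainable=False,  # keep the pre-trained vectors fixed
    ),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

joint_model.compile(optimizer="adam", loss="binary_crossentropy")
pretrained_model.compile(optimizer="adam", loss="binary_crossentropy")
```

In the first model the embedding matrix is just another set of trainable weights, which is exactly why pre-trained embeddings are optional rather than required. Pre-trained embeddings mainly help when your labelled dataset is small.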
Cheers,
Raymond