Word embedding as input

I have a question about word embeddings that I want to make sure I understand correctly:

Once we have our embedding matrix (either trained or transferred), we can use it to obtain embedding vectors - that is, multiplying each word's one-hot vector by the embedding matrix (which is equivalent to looking up the corresponding row of the matrix).

Do we use this embedding vector as input for the RNN/GRU/LSTM/Transformer or whatever model we want to use?

That's right: the embeddings are used as the inputs to the NLP model.
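To make this concrete, here is a minimal NumPy sketch (with a made-up vocabulary size and embedding dimension) showing that multiplying a one-hot vector by the embedding matrix is the same as selecting that word's row, which is the vector you would then feed to the RNN/LSTM/Transformer:

```python
import numpy as np

# Hypothetical sizes: 5-word vocabulary, 3-dimensional embeddings.
vocab_size, embed_dim = 5, 3
rng = np.random.default_rng(0)
E = rng.standard_normal((vocab_size, embed_dim))  # embedding matrix, one row per word

word_index = 2                  # index of some word in the vocabulary
one_hot = np.zeros(vocab_size)
one_hot[word_index] = 1.0

# One-hot vector times the embedding matrix ...
embedding_via_matmul = one_hot @ E
# ... picks out exactly that word's row of E.
embedding_via_lookup = E[word_index]

assert np.allclose(embedding_via_matmul, embedding_via_lookup)
```

In practice frameworks skip the matrix multiplication and do the row lookup directly (e.g. an embedding layer), since multiplying by a one-hot vector is wasteful; the result is identical.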