C5 Week 2: dimensions of the embedding matrix

In the lecture we saw that the embedding matrix has dimensions (dim, number_of_words), but in the second programming assignment we implemented the embedding matrix as "The embedding matrix has a row for each unique word in the vocabulary". Did I miss something?

Hi @Sofiane,

A row in an embedding matrix is the encoding of a word as a vector of dim dimensions, so each row corresponds to one word in the vocabulary. You can follow this article for further elaboration: Embeddings: A Matrix of Meaning. By Adam Schwab | by Petuum, Inc. | Medium
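As a minimal sketch of the row convention (the vocabulary, seed, and dimension below are made up for illustration): each word gets an index, and looking up its embedding is just selecting the corresponding row.

```python
import numpy as np

# Toy vocabulary and embedding size (illustrative values only)
vocab = ["cat", "dog", "fish"]
dim = 4

# Row convention: one row per word, shape (vocab_size, dim)
rng = np.random.default_rng(0)
E = rng.standard_normal((len(vocab), dim))

# Looking up a word's embedding selects its row
word_index = vocab.index("dog")
dog_vector = E[word_index]

print(E.shape)           # (3, 4)
print(dog_vector.shape)  # (4,)
```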

@yanivh
Thank you for the answer, but in the lecture the columns represent the word vectors.

Hi @yanivh,
Could you help me with this one?

Hi @Sofiane,

I understand that in the lecture you saw word embeddings as the columns of the embedding matrix. However, the common convention I am familiar with is the transpose of that, and it is also what the assignment uses. I suggest you continue with what is defined in the assignment. After all, the only difference between the two conventions is that one matrix is the transpose of the other.
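To see concretely that the two conventions carry the same information (the shapes and seed below are arbitrary assumptions for the sketch): a column of the lecture-style matrix equals the corresponding row of its transpose.

```python
import numpy as np

# Illustrative sizes only
vocab_size, dim = 100, 8
rng = np.random.default_rng(42)

# Lecture convention: embeddings are columns, shape (dim, vocab_size)
E_cols = rng.standard_normal((dim, vocab_size))

# Assignment convention: embeddings are rows; it's simply the transpose
E_rows = E_cols.T  # shape (vocab_size, dim)

# The same word's vector under both conventions is identical
i = 7
assert np.array_equal(E_cols[:, i], E_rows[i])
print("conventions agree for word", i)
```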