W2 A2: GloVe advantages and disadvantages

I noticed that one of the "You'll be able to" bullet points in W2 A2 says "Explain the advantages and disadvantages of the GloVe algorithm". I went through the notebook but didn't find any detail on this. Both Emojify models use the GloVe embedding map, but neither discusses the advantages and disadvantages of the word embedding algorithm itself. Did I miss anything?


Advantage (from the notebook):

  1. Taking the average of word vectors to represent the entire sentence yields pretty decent results with just 1 sigmoid neuron and a small training set.
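That averaging-plus-sigmoid idea can be sketched in a few lines of numpy. The embeddings and weights below are made up for illustration (the notebook uses 50-dimensional GloVe vectors and trained weights):

```python
import numpy as np

# Toy 4-d "embeddings" standing in for the notebook's 50-d GloVe vectors
# (hypothetical values, for illustration only).
word_to_vec = {
    "i": np.array([0.1, 0.2, 0.0, 0.1]),
    "am": np.array([0.0, 0.1, 0.1, 0.0]),
    "happy": np.array([0.9, 0.1, 0.8, 0.2]),
}

def sentence_to_avg(sentence, word_to_vec):
    """Average the embeddings of all words in the sentence."""
    words = sentence.lower().split()
    return np.mean([word_to_vec[w] for w in words], axis=0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A single sigmoid "neuron": prob = sigmoid(w . avg + b).
# The weights here are made up, not trained.
w = np.array([1.0, -1.0, 1.0, -1.0])
b = 0.0
avg = sentence_to_avg("i am happy", word_to_vec)
prob = sigmoid(w @ avg + b)
```

The whole "model" is one dot product and a sigmoid over the averaged vector, which is why such a small training set suffices.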

Disadvantage (from the notebook):

Word Ordering isn’t Considered in this Model

  • Note that the model doesn’t get the following sentence correct:

“not feeling happy”

  • This algorithm ignores word ordering, so it is not good at understanding phrases like “not happy.”
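The order-blindness is easy to demonstrate directly: any permutation of the same words produces the exact same average vector. A minimal sketch with made-up 2-d embeddings:

```python
import numpy as np

# Made-up 2-d embeddings standing in for real GloVe vectors.
emb = {
    "not": np.array([0.2, 0.8]),
    "feeling": np.array([0.5, 0.5]),
    "happy": np.array([0.9, 0.1]),
}

def avg_embedding(sentence):
    """Average the word vectors of a whitespace-tokenized sentence."""
    return np.mean([emb[w] for w in sentence.split()], axis=0)

a = avg_embedding("not feeling happy")
b = avg_embedding("happy not feeling")
# Identical averages: the model literally cannot tell the orderings apart.
print(np.allclose(a, b))  # -> True
```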

Word embeddings combined with recurrent layer(s) consider the position of words in a sentence and yield better results than the first approach of averaging the word embeddings that make up a sentence.
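To sketch why recurrence helps, here is a minimal tanh RNN over the same kind of toy embeddings (random, untrained weights, purely illustrative, and far simpler than the notebook's LSTM). Its final hidden state does depend on word order:

```python
import numpy as np

np.random.seed(0)

# Made-up 2-d embeddings, as before.
emb = {
    "not": np.array([0.2, 0.8]),
    "feeling": np.array([0.5, 0.5]),
    "happy": np.array([0.9, 0.1]),
}

def rnn_forward(sentence, Wx, Wh, b):
    """Run a simple tanh RNN over the word embeddings; return the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for word in sentence.split():
        h = np.tanh(Wx @ emb[word] + Wh @ h + b)
    return h

# Random (untrained) parameters for a 3-unit hidden state.
Wx = np.random.randn(3, 2)
Wh = np.random.randn(3, 3)
b = np.zeros(3)

h1 = rnn_forward("not feeling happy", Wx, Wh, b)
h2 = rnn_forward("happy not feeling", Wx, Wh, b)
# Unlike the averaged representation, the final states differ with word order.
```

Because each step feeds the previous hidden state back in, the representation of "happy" is computed in the context of having already seen "not".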

Hey @balaji.ambresh

That seems to be a discussion of the 2 models introduced in the notebook. Do you mean that the disadvantage of model 1 reflects a limitation of GloVe itself, and that's the reason for introducing the RNN? I don't think any word embedding algorithm can address the order in which words appear in a sentence. In fact, I don't think that's even word embedding's responsibility: it just tries to vectorize a word in a space, so that hopefully the meaningful dimensions of that space reflect some of the word's nature, which follow-up algorithms (applications) can then take advantage of.

Let me know if there is a misunderstanding.

That is correct.

As you rightly understood, embedding algorithms like GloVe can capture similarity between words, but lack the ability to consider the position of each word in the input sentence. Combining them with recurrence-based architectures gets around this limitation.
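For the similarity half of that statement, cosine similarity between embedding vectors is the usual measure. A toy sketch with hypothetical 3-d vectors (real GloVe vectors are 50-300 dimensional):

```python
import numpy as np

# Hypothetical embeddings: "happy" and "glad" point in similar directions,
# "table" does not.
emb = {
    "happy": np.array([0.9, 0.1, 0.8]),
    "glad": np.array([0.85, 0.15, 0.75]),
    "table": np.array([0.1, 0.9, 0.05]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 means same direction, 0.0 means orthogonal."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_related = cosine(emb["happy"], emb["glad"])
sim_unrelated = cosine(emb["happy"], emb["table"])
# Related words score higher than unrelated ones.
```

Note that the sentence-level representations built on top of these vectors, not the vectors themselves, are what must account for word order.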