Negative sampling (word2vec) analogous to dropout in basic NN?

Hi, during the lecture I thought that negative sampling seemed very similar to the dropout technique from the previous lecture (although the aim is quite different). The aim in Word2Vec was said to be computational, while dropout is used for regularization. Does this analogy make sense?


Good morning Masaki,

Sorry for the late reply, but let me share my thoughts here. Yes, there is some similarity in the sense that Word2Vec does not use all available samples, and dropout does not use all available neurons in a neural network. However, the implementation and the mathematics behind these techniques are different, so be careful about saying they are the same. Negative sampling changes the training objective so that each update only touches the true context word plus a few random "noise" words, which is a computational shortcut; dropout randomly zeroes activations during training to prevent co-adaptation, which is a regularizer. Finally, I'd like to share with you this article, which explains negative sampling and the reasoning behind it very well.
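To make the computational point concrete, here is a minimal NumPy sketch of the skip-gram negative-sampling loss. All names (`W_in`, `W_out`, the toy vocabulary size, and the specific word indices) are made up for illustration; the key point is that the loss only touches `k + 1` output vectors instead of all `vocab_size` of them, as a full softmax would.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, dim = 10_000, 50
k = 5  # number of negative (noise) samples per positive pair

# Toy embedding tables: input (center-word) and output (context-word) vectors
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss(center, context, negatives):
    """Skip-gram with negative sampling: push the score of the true
    (center, context) pair up and the scores of k random noise pairs
    down. Only k + 1 rows of W_out are involved, not all vocab_size."""
    v_c = W_in[center]
    pos = np.log(sigmoid(W_out[context] @ v_c))
    neg = np.sum(np.log(sigmoid(-W_out[negatives] @ v_c)))
    return -(pos + neg)

center, context = 42, 1337          # hypothetical word indices
negatives = rng.integers(0, vocab_size, size=k)
loss = neg_sampling_loss(center, context, negatives)
print(f"loss: {loss:.4f}")
```

Note that nothing here is random at inference time; the randomness only decides which noise words enter the objective during training. Dropout, by contrast, multiplies activations by a random binary mask inside the forward pass itself, so the two mechanisms live in different parts of the model.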

Happy learning,