The Emojify assignment in Sequence Models, Week 2, mentions "negative sampling" under "What you'll build" and "Congratulations".
However, the assignment doesn't actually cover negative sampling (it only trains the LSTM, not the word embeddings themselves). I suspect this is left over from a previous version of the assignment, and it might be worth removing for the sake of correctness.
P.S. Andrew does say in the lecture video that the assignment would cover negative sampling, but I didn't find it in the assignment either.
Andrew mentions it in the “Negative Sampling” video for the week, around the 8:30 mark.
The transcript is as follows:
“Which is relatively cheap to do on every iteration rather than updating a 10,000 way Softmax classifier. So you get to play with this algorithm in the problem exercise for this week as well. So this technique is called negative sampling because what you’re doing is, you have a positive example, the orange and then juice.”
The transcript says "problem exercise", but I hear "programming exercise" in the audio.
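For anyone wondering what the missing exercise might have covered: the idea Andrew describes is to replace the 10,000-way softmax with one positive and k sampled negative binary classifiers per training pair. Here is a minimal sketch of that loss (all variable names are hypothetical, not from the notebook):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def negative_sampling_loss(center_vec, context_vec, negative_vecs):
    """Binary logistic loss for one (center, context) pair:
    pull the true pair together, push the k negatives apart."""
    pos = -np.log(sigmoid(context_vec @ center_vec))
    neg = -np.sum(np.log(sigmoid(-(negative_vecs @ center_vec))))
    return pos + neg

rng = np.random.default_rng(0)
d, k = 50, 5                         # embedding size, negatives per pair
center = rng.standard_normal(d)      # embedding of the center word
context = rng.standard_normal(d)     # embedding of the true context word
negatives = rng.standard_normal((k, d))  # k randomly sampled negatives

loss = negative_sampling_loss(center, context, negatives)
```

Each update touches only k + 1 binary classifiers instead of the full vocabulary, which is the "relatively cheap" part Andrew refers to in the transcript above.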