Hi, I believe the equation for the cosine (used for cosine similarity) in the lecture video ‘properties of word embeddings’, as well as in the lecture notes, incorrectly contains a transposed vector in the numerator. The product of the two vectors should be an inner product, so neither vector should be transposed. Thanks
I believe you’re referring to this part of the lecture:
Whether a transposition is required depends on the notation being used and whether vectors can have more than one shape. That’s an implementation detail.
Andrew has a long-standing habit in his lectures of using “u transpose v” to denote a dot product. This dates back to when this course used MATLAB, which allows vectors to be either columns or rows. His standard was to use column vectors, so “u transpose v” works dimensionally as (1 x n) * (n x 1).
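To see that the two notations give the same number, here is a minimal NumPy sketch (NumPy is my choice for illustration, not necessarily the course's exact code): with column vectors of shape (n, 1), the matrix product u.T @ v is a 1x1 matrix holding exactly the inner product.

```python
import numpy as np

# Two column vectors of shape (n, 1), following the column-vector convention.
u = np.array([[1.0], [2.0], [3.0]])
v = np.array([[4.0], [5.0], [6.0]])

# "u transpose v": (1 x n) @ (n x 1) -> a 1x1 matrix; .item() extracts the scalar.
ut_v = (u.T @ v).item()

# The same value via the plain dot product on flat (n,) vectors.
dot_uv = np.dot(u.ravel(), v.ravel())

print(ut_v, dot_uv)  # both print 32.0
```

So “u transpose v” and the inner product are the same scalar; the transpose is just bookkeeping for the column-vector convention.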
If you fast-forward to this week’s programming assignment, you’ll see that it uses the more traditional dot-product notation.
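In that dot-product style, cosine similarity looks roughly like the sketch below. This is an illustrative version, not the assignment's exact code; the function name and variable names are my own.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between 1-D vectors u and v: dot(u, v) / (||u|| * ||v||)."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

a = np.array([1.0, 2.0, 3.0])
print(cosine_similarity(a, a))   # a vector with itself -> approx. 1.0
print(cosine_similarity(a, -a))  # opposite directions  -> approx. -1.0
```

No transpose appears because np.dot on 1-D arrays is already the inner product.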
Thanks TMosh, agreed.