Hi ,

In the lecture, Andrew says that the cosine similarity of two vectors u, v is `u.T * v / |u|*|v|`

but in the linear algebra course I learned that cosine similarity is `u*v / |u|*|v|`

Which of these formulas is right?


They are the same, although I don’t think you rendered the first one correctly. Prof Ng uses * to mean “elementwise” product, but that is a dot product in that formula. It’s just a notation issue. In the C5 W2 A1 assignment, they show the formula in mathematical form as:

CosineSimilarity(u, v) = \displaystyle \frac {u \cdot v}{||u||_2||v||_2}

That is the same as the formulation you show from the Linear Algebra course. But if you then express that in numpy code and the vectors u and v are column vectors, then the numerator would be u^T \cdot v, or `np.dot(u.T, v)`.

Whether you need to transpose depends on the shape of the vectors. The math is a dot product. The transpose is really an implementation detail.
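To make that concrete, here is a small sketch in numpy (the variable names and values are arbitrary, chosen only for illustration) showing that the transpose is needed for column vectors but not for plain 1-D vectors, and that both give the same cosine similarity:

```python
import numpy as np

# Two column vectors of shape (3, 1); values are arbitrary, just for illustration
u = np.array([[1.0], [2.0], [3.0]])
v = np.array([[4.0], [5.0], [6.0]])

# With column vectors, the numerator needs the transpose:
# np.dot(u.T, v) multiplies a (1, 3) by a (3, 1), giving a (1, 1) result
numerator = np.dot(u.T, v)
cos_sim = (numerator / (np.linalg.norm(u) * np.linalg.norm(v))).item()

# With plain 1-D vectors, np.dot is already the dot product -- no transpose needed
u_flat, v_flat = u.ravel(), v.ravel()
cos_sim_flat = np.dot(u_flat, v_flat) / (np.linalg.norm(u_flat) * np.linalg.norm(v_flat))

print(cos_sim, cos_sim_flat)  # both print the same value
```

Same math either way; the `.T` only appears because of how the vectors are stored.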

Hello @Pavel_Grobov

I would put it this way:

To begin with, `.T` means the Transpose operation.

Then, when I learned linear algebra, I learned about vectors and matrices. For vectors, we have the dot product. For matrices, we have the matrix multiplication and the transpose. The transpose is for matrices.

When we dotted two vectors, we wrote {\bf a} \cdot {\bf b} and did not care about their orientations, because there is only one way to dot two vectors together.

When we learned matrices, we brought our understanding of a vector to a higher level, where we knew there were two possible variants: a row vector (a 1-row matrix) and a column vector (a 1-column matrix).

Now, the orientations matter when we multiply two matrices, and we have the Transpose operation (`.T`) introduced for matrices.

If we have two row vectors, in our primary understanding of vector algebra, we can write it as {\bf a} \cdot {\bf b}. However, if we have two 1-row matrices, we write the product as {\bf a}{\bf b}^T. In other words, {\bf a} \cdot {\bf b} = {\bf a}{\bf b}^T: the dot product of two vectors is equivalent to the matrix multiplication of a 1-row matrix and the Transpose of another 1-row matrix.

Similarly, if we have two column vectors, we have {\bf a} \cdot {\bf b} = {\bf a}^T {\bf b}. (Note that the \cdot symbol is used exclusively for vector-vector dot product but not for matrix-matrix multiplication)
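As a small numpy sketch of the equivalences above (the values are arbitrary), all three forms produce the same number:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])          # a plain 1-D vector
b = np.array([4.0, 5.0, 6.0])

dot = np.dot(a, b)                      # the vector dot product a . b

# Row-vector (1-row matrix) context: a . b == a b^T
a_row, b_row = a.reshape(1, -1), b.reshape(1, -1)
row_result = (a_row @ b_row.T).item()   # (1, 3) times (3, 1) gives (1, 1)

# Column-vector (1-column matrix) context: a . b == a^T b
a_col, b_col = a.reshape(-1, 1), b.reshape(-1, 1)
col_result = (a_col.T @ b_col).item()   # (1, 3) times (3, 1) gives (1, 1)

print(dot, row_result, col_result)      # all three are equal
```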

Therefore, I think whether or not you have the Transpose operation there depends on the context. In textbooks or on slides, we can freely switch between contexts to support our use of symbols. However, when coding, we almost always represent a vector as either a 1-row or a 1-column matrix, and such a representation fixes our context to the matrix context, where we cannot omit the Transpose operator.

Cheers,

Raymond

Now I get it!

Thank you

Thanks for the answer!