Vectorization in NumPy

Dr. Ng said that we can calculate w^T x with np.dot(w.T, x).
The dot product of two vectors is defined as x \cdot y = x^T y, right?
Shouldn't it just be np.dot(w, x)? That would be the correct notation, I guess. Can someone clarify this?

Please see this, keeping array shapes in mind.
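To illustrate why the shapes matter here (a minimal sketch, with w and x as made-up example arrays): np.dot treats 1-D vectors and 2-D column vectors differently, and the transpose only comes into play in the 2-D case.

```python
import numpy as np

# 1-D arrays: np.dot computes the inner product directly,
# so no transpose is needed (or even meaningful).
w = np.array([1.0, 2.0, 3.0])
x = np.array([4.0, 5.0, 6.0])
print(np.dot(w, x))        # 32.0

# 2-D column vectors of shape (3, 1): the inner dimensions
# must match, so w has to be transposed to shape (1, 3) first.
w2 = w.reshape(3, 1)
x2 = x.reshape(3, 1)
print(np.dot(w2.T, x2))    # [[32.]] -- a (1, 1) array
# np.dot(w2, x2) would raise ValueError: shapes (3,1) and (3,1) not aligned
```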

I looked at it. I do know programming, and MATLAB as well, but my question is about something different, and that page about the dot function covers something different too. I did not understand what you were trying to have me look at. Can you please elaborate on that, or answer my question directly?

Python is not as nice as MATLAB at handling vector operations. In MATLAB, all of that is native to the language, and the way the polymorphism works is really beautiful. But in Python, they had to build a separate library, NumPy, to provide vector functions. In NumPy you have two functions you can use for a real dot-product-style matrix multiply: np.dot or np.matmul. With 2D objects, the two are equivalent. NumPy also provides "elementwise" matrix multiply as np.multiply. You also have "override" operators that you can use: * is equivalent to np.multiply and @ is equivalent to np.matmul.
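Here is a quick sketch of those equivalences (A and B are just made-up 2-D example arrays):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# True matrix multiply: all three forms agree for 2-D arrays.
print(np.allclose(np.dot(A, B), np.matmul(A, B)))   # True
print(np.allclose(A @ B, np.matmul(A, B)))          # True

# Elementwise multiply: * is shorthand for np.multiply.
print(np.allclose(A * B, np.multiply(A, B)))        # True

# The two operations give different results, of course.
print(np.dot(A, B))   # [[19. 22.], [43. 50.]]
print(A * B)          # [[ 5. 12.], [21. 32.]]
```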

In the particular case of logistic regression, we have w as a column vector of dimension n_x x 1, and X (the input "samples" matrix) has dimension n_x x m, where n_x is the number of features and m is the number of samples. So if we need the dot product of w and X, it will require that we transpose w first, right? That's exactly the same in MATLAB. It's just the underlying math. The math expression is w^T \cdot X, and you have several ways to express that in Python, all of which are equivalent:

np.dot(w.T, X)
np.matmul(w.T, X)
w.T @ X
np.transpose(w) @ X
np.dot(np.transpose(w), X)

Of those choices, Prof Ng will usually choose the first one. If you are just getting started here, it is also helpful to be cognizant of the notation conventions Prof Ng uses, which are discussed on this thread.
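If you want to convince yourself that the five forms above really are equivalent, here is a small sketch you can run (the sizes n_x = 4 and m = 3 are just made up for illustration):

```python
import numpy as np

n_x, m = 4, 3                      # illustrative sizes: 4 features, 3 samples
rng = np.random.default_rng(0)
w = rng.standard_normal((n_x, 1))  # weight column vector, shape (n_x, 1)
X = rng.standard_normal((n_x, m))  # samples matrix, shape (n_x, m)

results = [
    np.dot(w.T, X),
    np.matmul(w.T, X),
    w.T @ X,
    np.transpose(w) @ X,
    np.dot(np.transpose(w), X),
]

# All five produce the same (1, m) row vector of w^T \cdot X values.
print(results[0].shape)                                  # (1, 3)
print(all(np.allclose(r, results[0]) for r in results))  # True
```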