What is this T here? And why are we taking its dot product with X?

As you can see in the image from C1 W2, Andrew Ng even says it is the dot product of x and w. Can anyone help me with that?

T is the transpose.

To multiply two matrices, the number of columns in the first matrix must match the number of rows in the second matrix.
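
In symbols (just restating the rule above, with m, n, and p as generic sizes): an m \times n matrix times an n \times p matrix gives an m \times p result.

(m \times n)\,(n \times p) \;\to\; (m \times p)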


The ‘T’ stands for a matrix transpose.

So if this were W:

\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}

This is its transpose (T):

\begin{bmatrix} 1 & 3\\ 2 & 4 \end{bmatrix}

In order to take the dot product of the weights and the data X, the two matrices have to be of the right dimensions. So the number of columns of the first matrix must be the same as the number of rows of the second matrix.
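
As a quick sanity check of that rule in NumPy (a made-up example, not from the assignment):

```python
import numpy as np

A = np.ones((2, 3))              # 2 x 3
B = np.ones((3, 2))              # 3 x 2

print(np.dot(A, B).shape)        # (2, 2): the inner dimensions (3 and 3) match

try:
    np.dot(A, A)                 # inner dimensions are 3 and 2, so this fails
except ValueError as err:
    print("shapes not aligned:", err)
```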


Sorry for the follow-up question…

But one more doubt: why does taking the transpose of W give it the right dimensions to multiply with X? How does that work?

Hey @saifkhanengr or others: something has come up and I suddenly need to step out for a bit… Can someone help him with this? It is confusing, @saifkhanengr, because I only gave you an example of a 2 x 2 matrix. It will be clearer once you see the transpose of a matrix of a different size.
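
For example (my own made-up 2 x 3 matrix, just for illustration), transposing turns the rows into columns, so a 2 x 3 matrix becomes 3 x 2:

\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}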

You have to work out what the dimensions are. Prof Ng has defined the weight vector w to be a column vector. It has n_x elements, where n_x is the number of features in each input vector. So the dimensions of w are n_x x 1. Then x is one input sample, and it is also a column vector of dimension n_x x 1. So you can’t do the dot product without the transpose, because the “inner” dimensions need to agree:

n_x x 1 dot n_x x 1 does not work.

So if we transpose w, then w^T is a row vector with dimensions 1 x n_x. Now it works:

1 x n_x dotted with n_x x 1 gives us a 1 x 1 answer, which is equivalent to a scalar.
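
Written out for, say, n_x = 3 (a made-up size, just to make the shapes concrete):

w^T x = \begin{bmatrix} w_1 & w_2 & w_3 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = w_1 x_1 + w_2 x_2 + w_3 x_3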

Note that you could also do the dot product in the other order:

n_x x 1 dotted with 1 x n_x, but that gives a result that is n_x x n_x and has nothing to do with what we are trying to achieve here. Here’s a thread which demonstrates why.
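
Here is a minimal NumPy sketch of both orders; the variable names and n_x = 4 are just my own choices for illustration:

```python
import numpy as np

n_x = 4
w = np.random.randn(n_x, 1)   # column vector, n_x x 1
x = np.random.randn(n_x, 1)   # column vector, n_x x 1

inner = np.dot(w.T, x)        # (1, n_x) dot (n_x, 1) -> (1, 1), effectively a scalar
outer = np.dot(x, w.T)        # (n_x, 1) dot (1, n_x) -> (n_x, n_x), not what we want here

print(inner.shape)            # (1, 1)
print(outer.shape)            # (4, 4)
```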

The higher-level point here is that we always start with a mathematical formula. Then we need to figure out how to express that formula with linear algebra operations. Then, as the final step, we need to figure out how to express those linear algebra operations in Python.
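
For the logistic regression example from this week, that chain looks roughly like the sketch below. It assumes w and x are column vectors and b is a scalar, with made-up sizes and values, so treat it as an illustration rather than the assignment code:

```python
import numpy as np

def sigmoid(z):
    # the sigmoid activation: 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

n_x = 3                        # made-up number of features
w = np.random.randn(n_x, 1)    # weights: n_x x 1 column vector
x = np.random.randn(n_x, 1)    # one input sample: n_x x 1 column vector
b = 0.5                        # bias: a scalar

# math:           z = w^T x + b,  a = sigmoid(z)
# linear algebra: (1 x n_x) dot (n_x x 1) + scalar -> 1 x 1
# python:
z = np.dot(w.T, x) + b
a = sigmoid(z)
print(z.shape, a.shape)        # (1, 1) (1, 1)
```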


Finally got it :) Thank you so much, sir!!

Great! The other thing worth pointing out is that the way Prof Ng has defined everything here is just his choice. He could have defined w as a row vector, but he chooses the convention that all standalone vectors are column vectors.

Pay careful attention when we get to next week and the weights become matrices, instead of vectors. There Prof Ng chooses to define the weight matrices so that the transpose is no longer required.
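
As a hedged preview of that convention (the shapes follow what Prof Ng describes, but the variable names and sizes here are my own):

```python
import numpy as np

n_x = 3   # number of input features
n_h = 4   # number of units in the layer
m = 5     # number of training examples

W = np.random.randn(n_h, n_x)   # weight matrix: one row of weights per unit
X = np.random.randn(n_x, m)     # data: each column is one training example
b = np.zeros((n_h, 1))          # bias: broadcast across the m columns

Z = np.dot(W, X) + b            # (n_h x n_x) dot (n_x x m) -> n_h x m, no transpose needed
print(Z.shape)                  # (4, 5)
```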


Prof Ng starts by defining w and x as column vectors, so the transpose is used when calculating z as the matrix multiplication of w^T and x. Later he drops the transpose and defines W so that it can be multiplied directly with X. This confused me in one of the quizzes. It might be better to avoid w^T in the early lessons altogether, since the weights are arranged to be used directly for calculating z in the subsequent lessons.


Well, I think the deal is Prof. Ng presumes we know nothing about linear algebra, and wants to make things easier on new students, and I laud that.

I mean, IMHO, if you feel up to it, you should take one of the related MITx classes.

It was only offered once, I think, but I took their Computation Structures series (I think it was 6.004x), where we went from literally designing at the TTL level all the way up to building a processor and writing our own assembler and OS.

But there they give you no hints whatsoever, and you get maybe three chances?

I learned so much from that course, yet what learning style suits you is your choice.


I agree; your solution cleared up a lot of questions in my mind. When I first looked at it, I got confused because I didn’t realize we weren’t actually at the point of building nets yet.