Week 2: w1 and w2 as inputs for logistic regression - Gradient Descent

In the video “Logistic Regression Gradient Descent”, the formula for z is defined as z = w^T x + b, but when he lists the inputs for z he uses x1, w1, x2, w2, and b, and the formula changes to z = w1x1 + w2x2 + b.
How and why does this change happen, and what are w2 and x2 supposed to be?

I do not think this has been mentioned up to this point in the course.
I would be thankful if someone could help me with this.

Thank you

X is a matrix. Its rows and columns hold the feature values for each example.

‘w’ is a vector with ‘n’ elements, where ‘n’ is the number of features in the dataset.

w1, w2, etc. are the values of individual elements of the ‘w’ vector.

For each feature in x, there is a weight in the ‘w’ vector.
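
For concreteness, here is a minimal NumPy sketch of those shapes (the values, and the column-per-example layout of X, are illustrative assumptions rather than anything from the course materials):

```python
import numpy as np

# Hypothetical tiny dataset: m = 3 examples, n = 2 features.
# Assumption: each column of X is one training example, so X is (n, m).
X = np.array([[0.5, 1.2, -0.3],   # values of feature x1 across the examples
              [2.0, 0.7,  1.5]])  # values of feature x2 across the examples

# One weight per feature: 'w' has n = 2 elements.
w = np.array([[0.1],
              [0.4]])             # shape (n, 1)

b = 0.25

# z for all examples at once: z = w^T X + b, shape (1, m)
z = np.dot(w.T, X) + b
print(z)  # one z value per example
```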

Right! If w = (w_1, w_2) and x = (x_1, x_2) are both vectors with two entries, then Prof Ng is just writing out what the vector version of the formula means in terms of the components of the vectors. Here is the formula stated in vector form:

z = w^T x + b

Which then gives us

z = w_1 * x_1 + w_2 * x_2 + b

if you simply write out what the dot product means.
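
As a quick sanity check, here is a minimal NumPy sketch (the numbers are arbitrary) showing that the vector form and the written-out component form give the same z:

```python
import numpy as np

# Arbitrary illustrative values for one example with two features.
w = np.array([0.5, 0.25])  # w = (w_1, w_2)
x = np.array([2.0, 4.0])   # x = (x_1, x_2)
b = 1.0

# Vector form: z = w^T x + b (a dot product plus the bias)
z_vector = np.dot(w, x) + b

# Component form: z = w_1*x_1 + w_2*x_2 + b
z_components = w[0] * x[0] + w[1] * x[1] + b

print(z_vector, z_components)  # both print 3.0
```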
