In the video "Logistic Regression Gradient Descent" the formula for z is defined as z = w^T x + b, but then as inputs for z he uses x1, w1, x2, w2 and b, and the formula for z changes to z = w1*x1 + w2*x2 + b.
How and why does this change happen, and what are w2 and x2 supposed to be?
I do not think this has been mentioned up to this point in the course.
Would be thankful if someone could help me with this.
Right! If w = (w_1, w_2) and x = (x_1, x_2) are both vectors with two entries, then Prof Ng is just writing out what the vector version of the formula means in terms of the components. In vector form it is:
z = w^T x + b
Which then gives us
z = w_1 * x_1 + w_2 * x_2 + b
if you simply write out what the dot product means.
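For what it's worth, here is a minimal NumPy sketch of the same identity. The specific numbers are just made up for illustration, not from the video:

```python
import numpy as np

# Illustrative values (assumed, not from the course)
w = np.array([0.5, -1.2])  # weights w_1, w_2
x = np.array([2.0, 3.0])   # features x_1, x_2
b = 0.1                    # bias

# Vector form: z = w^T x + b
z_vector = np.dot(w, x) + b

# Component form: z = w_1*x_1 + w_2*x_2 + b
z_components = w[0] * x[0] + w[1] * x[1] + b

print(z_vector, z_components)  # both print -2.5
```

Both expressions compute the same number; the component form is just the dot product written out entry by entry. With n features instead of 2, the vector form stays exactly the same, which is why the course switches to it.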