According to optional lab2 in week3,
"The formula for a sigmoid function is as follows -
g(z) = 1 / (1 + e^(-z))
In the case of logistic regression, z (the input to the sigmoid function), is the output of a linear regression model. "
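For reference, the quoted formula can be sketched in NumPy (a minimal illustration, not the lab's actual code):

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z)); maps any real z into the interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))    # 0.5
print(sigmoid(10.0))   # very close to 1
print(sigmoid(-10.0))  # very close to 0
```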

I don’t really understand why the input z is the output of a linear regression model rather than the feature vector itself.

edit:
After going through the entire lab, I can see that the input to the logistic function is computed from the feature vector, but I still don’t get the implications of the above statement.
I’ve also gone through the relevant lecture, where the output of the linear regression model is fed as the input to the logistic function… which only confuses me more.

In this example we want the output to be between 0 and 1. A linear regression model can output any real number, but for a classification model we want something we can interpret as a probability of class 1 versus class 0. So we take the linear model’s output y (which is just w*x + b), and pass it through the sigmoid function, which squashes it into the range (0, 1); thresholding that value (say at 0.5) then gives the 0-or-1 label. Taking z to be x itself would accomplish nothing: we would just be applying a fixed math formula, with no parameters (w and b) for the model to actually learn. Hope your doubt is resolved now.
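A minimal sketch of that pipeline, with made-up values for w, b, and x (these parameters are hypothetical, not from the lab):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical learned parameters and one feature vector
w = np.array([0.5, -1.2])
b = 0.3
x = np.array([2.0, 1.0])

z = np.dot(w, x) + b           # linear-model output: can be any real number
p = sigmoid(z)                 # squashed into (0, 1): interpreted as P(y = 1 | x)
y_hat = 1 if p >= 0.5 else 0   # thresholding gives the 0-or-1 class label

print(z, p, y_hat)
```

The learning happens entirely in w and b; the sigmoid is a fixed function that converts the linear score into a probability-like value.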

Thank you for explaining in detail; I’ve understood the need for the logistic function. I was just wondering about the significance of this statement:
"In the case of logistic regression, z (the input to the sigmoid function), is the output of a linear regression model. "

Why are we using a linear function as the input to the logistic function rather than some other polynomial function (which we see later in week 3)? Is it purely due to the training data we’ve considered?

Even polynomial models are really linear, since the only thing that is “polynomial” about them is how you combine the features ‘x’ into new features.

Once you create a new feature, it’s just a real number. There’s nothing specifically polynomial about it, and it’s still linear regression (and then in this discussion, you still apply sigmoid() to it).
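To make that concrete, here is a small sketch (with hypothetical parameter values) showing that adding an x^2 feature still leaves the model linear in the parameters w and b:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([3.0])                  # original single feature
feats = np.array([x[0], x[0] ** 2])  # engineered features: x and x^2

# The model is still linear *in the parameters* w and b,
# even though feats contains a polynomial of x.
w = np.array([0.4, -0.1])            # hypothetical parameters
b = 0.2
z = np.dot(w, feats) + b             # still a linear combination
p = sigmoid(z)                       # then apply sigmoid, as in this discussion

print(z, p)
```

Once `feats` is computed, each entry is just a real number, so the training procedure is exactly the same as plain linear (or logistic) regression on those new features.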