Optional Lab: Logistic Regression

In the optional lab, I noticed:
"Note, the orange line is 'z' or 𝐰⋅𝐱(𝑖)+𝑏 above. It does not match the line in a linear regression model."

It's unclear to me why wx + b would be different than what we get in the linear regression model. Isn't z nothing but the linear regression model?

In logistic regression, you also use the sigmoid() function.
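To make that concrete, here is a small NumPy sketch (my own illustration, not code from the lab; w, b, and x are made-up values). The point is that z is computed with exactly the same expression as in linear regression, and logistic regression then passes that same z through sigmoid():

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative parameters and a single feature value
w = np.array([2.0])
b = -3.0
x = np.array([1.5])

z = np.dot(w, x) + b      # same linear expression used in linear regression
f_linear = z              # linear regression prediction: z itself
f_logistic = sigmoid(z)   # logistic regression prediction: g(z), squashed into (0, 1)

print(f_linear, f_logistic)
```

The way I read the lab's note: the formula for z is the same, but a logistic regression model is trained on the sigmoid output with a different cost function, so the learned w and b will generally not match the ones a linear regression fit would give you, and the orange line ends up in a different place.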

Sigmoid is used on top of the z function, right? So, g(z) is the sigmoid, but z itself should be an affine function and therefore the same as what you get via linear regression.

sigmoid() does transform 'z', but I don't think "affine" is the best description.

For one thing, when you need the gradients of the logistic function, you have to deal with the partial derivative of the sigmoid function. That's a bit more complicated than the partial derivative of z = w*x + b, which is just a constant ('w' with respect to x, or 'x' with respect to w).
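For example, here is a rough sketch (again my own, with made-up values) comparing the two derivatives; it uses the standard identity g'(z) = g(z)(1 - g(z)) and checks the chain-rule result numerically:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    # d/dz g(z) = g(z) * (1 - g(z))
    s = sigmoid(z)
    return s * (1.0 - s)

w, b, x = 2.0, -3.0, 1.5
z = w * x + b

dz_dx = w                                # derivative of the linear part: just a constant
dg_dx = sigmoid_derivative(z) * dz_dx    # chain rule: d g(z)/dx = g'(z) * dz/dx

# Finite-difference check of dg_dx (illustrative only)
eps = 1e-6
numeric = (sigmoid(w * (x + eps) + b) - sigmoid(w * (x - eps) + b)) / (2 * eps)
print(dz_dx, dg_dx, numeric)
```

So the extra work comes from g'(z), which depends on z itself, whereas the derivative of the linear part is a constant.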