Logistic regression output function

In the equation ŷ = sigmoid(wᵀx + b), how was the requirement for the parameters w and b discovered? Why not just use sigmoid(x) to get ŷ?

Hello Rithvik_M_Ballal,
Welcome to the Discourse community. I will do my best to reply to your question. If you have any additional issues please feel free to reply with your issue.

Using sigmoid(x) alone to get ŷ would not be appropriate, because it ignores the relationship between the input features and the outcome variable. The purpose of logistic regression is to learn a model that captures this relationship and makes accurate predictions on new input data.

The parameters w and b are what let the model learn that relationship: w weights each input feature, and b shifts the decision boundary. Both are learned during training by minimizing a loss function that measures the difference between the predicted probabilities and the true labels. By adjusting the values of w and b, the model learns to make more accurate predictions.
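A tiny sketch can make this concrete. Below is a toy one-feature example (the data and the values of w and b are made up purely for illustration, not learned): sigmoid(x) with no parameters predicts class 1 for every positive input, while sigmoid(w·x + b) can place the decision boundary where the labels actually flip.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: one feature, labels flip around x = 5, not around x = 0.
X = np.array([1.0, 2.0, 3.0, 7.0, 8.0, 9.0])
y = np.array([0, 0, 0, 1, 1, 1])

# Without parameters: sigmoid(X) is above 0.5 for every positive x,
# so every example is predicted as class 1 and half are wrong.
no_params = (sigmoid(X) > 0.5).astype(int)

# With parameters (hand-picked here to mimic what training would find):
# w scales the feature and b shifts the boundary to x = 5.
w, b = 2.0, -10.0
with_params = (sigmoid(w * X + b) > 0.5).astype(int)

print(no_params)    # [1 1 1 1 1 1] -- wrong on the first three
print(with_params)  # [0 0 0 1 1 1] -- matches y
```

In real training, w and b would be found by gradient descent on the cross-entropy loss rather than set by hand, but the role they play is exactly this: letting the sigmoid's output fit the data instead of being fixed by the raw inputs.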

I am here to help you to the best of my abilities. Please feel free to reply back if you have a followup question.
Can Koz

What sort of model would you have if sigmoid(x) did not happen to equal y? You have to give the model a way to learn to match the y values. That's what w and b are for.

Thanks a lot for your reply. That makes sense.