Need help understanding the logistic regression cost function

Here, how can we update w_j and b the same way as in linear regression?

Earlier we learned that the graph looks like the one below:

Here we can clearly see that this graph is not convex (it has many local minima). To make the cost convex, we compute the logistic regression cost using the log function, see below:
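A quick numeric sketch of why the squared-error cost is the problem (toy setup of my own: a single training example with x = 1, y = 1, and a sigmoid model). Convexity requires the cost at a midpoint to be at most the average of the costs at the endpoints, and this check finds a violation:

```python
import math

def sigmoid(z):
    # Logistic function.
    return 1.0 / (1.0 + math.exp(-z))

def squared_error_cost(w, x=1.0, y=1.0):
    # Squared-error cost for one example when the model is a sigmoid.
    f = sigmoid(w * x)
    return (f - y) ** 2

# Convexity would require J(midpoint) <= average of J(endpoints)
# for every pair of points. This triple violates that:
a, b = -4.0, 0.0
mid = (a + b) / 2.0  # -2.0
j_mid = squared_error_cost(mid)
j_avg = (squared_error_cost(a) + squared_error_cost(b)) / 2.0
print(j_mid > j_avg)  # True => the squared-error cost is not convex here
```

So plain squared error on top of a sigmoid gives a non-convex cost, which is why the course switches to the log-based cost.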

So why are we not differentiating the cost function that was created using the log? Please help me understand. If you need more clarification about the question, just tell me and I will explain it more clearly.


From what I understand: for logistic regression the cost function is calculated using the log (the log loss), not just a linear squared-error value. And the gradient descent update is obtained by differentiating that log-based cost!
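To fill in the missing step (a sketch in the course's notation, for one training example, where f = f_wb(x) = sigmoid(z) and z = w·x + b): differentiating the log loss by the chain rule gives

```latex
% Log loss for one example, with f = \sigma(z),\; z = \vec{w}\cdot\vec{x} + b
L(f, y) = -\bigl[\, y \log f + (1 - y)\log(1 - f) \,\bigr]

% The three chain-rule factors:
\frac{\partial L}{\partial f} = \frac{f - y}{f(1-f)}, \qquad
\frac{\partial f}{\partial z} = f(1-f), \qquad
\frac{\partial z}{\partial w_j} = x_j

% Multiplying them, the f(1-f) factors cancel:
\frac{\partial L}{\partial w_j}
  = \frac{f - y}{f(1-f)} \cdot f(1-f) \cdot x_j
  = (f - y)\, x_j
```

The f(1-f) factors cancel, which is why the final gradient has the same shape as linear regression's even though it was derived from the log cost.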


I tried searching with ChatGPT and came to learn that it is as you say.

Thank you!


Same question here! What confused me is that the cost function for logistic regression is presented, but then, for logistic regression gradient descent, this same cost function does not seem to be used at all :face_with_spiral_eyes:

I hope you have got the answer by now: that gradient descent update does come from differentiating the log cost function itself. The two updates only look similar; f_wb(x) is different in the two cases (a straight line for linear regression, a sigmoid for logistic regression).
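You can check this numerically: the gradient formula from the lectures, (1/m) Σ (f_wb(x) − y) x, should match a finite-difference derivative of the log cost. A minimal sketch (the toy data and starting point are made up for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def log_cost(w, b, X, Y):
    # Logistic regression cost: average log loss over the data set.
    m = len(X)
    total = 0.0
    for x, y in zip(X, Y):
        f = sigmoid(w * x + b)
        total += -(y * math.log(f) + (1 - y) * math.log(1 - f))
    return total / m

def analytic_grad_w(w, b, X, Y):
    # The lecture formula: (1/m) * sum((f_wb(x) - y) * x),
    # where f_wb is the sigmoid applied to the linear part.
    m = len(X)
    return sum((sigmoid(w * x + b) - y) * x for x, y in zip(X, Y)) / m

# Toy data (made up for illustration).
X = [0.5, 1.5, -1.0, 2.0]
Y = [0, 1, 0, 1]
w, b = 0.3, -0.1

# Central finite difference of the log cost with respect to w.
eps = 1e-6
numeric = (log_cost(w + eps, b, X, Y) - log_cost(w - eps, b, X, Y)) / (2 * eps)
gap = abs(analytic_grad_w(w, b, X, Y) - numeric)
print(gap)  # tiny gap => differentiating the log cost gives the lecture formula
```

So the familiar-looking update really is the derivative of the log cost; only f_wb(x) changes between the two models.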