Asking about the derivative of w[j] in gradient descent for logistic regression

Hi everyone,
I just learned about gradient descent for logistic regression and I'm confused about the derivative with respect to w[j]. Although I know that the difference between gradient descent for linear regression and logistic regression lies in the definition of f_wb(x), I can't understand why x[j] still appears when we update w[j]. Since the definition of f_wb(x) has changed, I wonder why x[j] is still there.

I tried working through the math to prove it, but I didn't get the correct answer.
I'd be very happy to finally understand this.

Hello @Phan_Phuoc, please check out this discussion for some hints on deriving the formula.
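For reference, here is a sketch of the key cancellation (using the course's notation, where f_wb(x) is the sigmoid applied to the linear term). The sigmoid's derivative cancels exactly with the denominator coming from the log loss, which is why x[j] survives unchanged:

```latex
\begin{aligned}
f_{w,b}(x) &= \sigma(z), \qquad z = w \cdot x + b, \qquad \sigma(z) = \frac{1}{1+e^{-z}} \\
L &= -\,y \log f_{w,b}(x) - (1-y)\log\bigl(1 - f_{w,b}(x)\bigr) \\
\frac{\partial L}{\partial f} &= \frac{f_{w,b}(x) - y}{f_{w,b}(x)\,\bigl(1 - f_{w,b}(x)\bigr)} \\
\frac{\partial f}{\partial z} &= f_{w,b}(x)\,\bigl(1 - f_{w,b}(x)\bigr) \\
\frac{\partial z}{\partial w_j} &= x_j \\
\frac{\partial L}{\partial w_j} &= \frac{\partial L}{\partial f}\cdot\frac{\partial f}{\partial z}\cdot\frac{\partial z}{\partial w_j}
  = \bigl(f_{w,b}(x) - y\bigr)\, x_j
\end{aligned}
```

Averaging over the m training examples gives the familiar update term (1/m) Σᵢ (f_wb(x⁽ⁱ⁾) − y⁽ⁱ⁾) xⱼ⁽ⁱ⁾, which looks identical to linear regression even though f_wb(x) is now the sigmoid.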



Thank you very much. Now I have learned a new trick in calculus.
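For anyone else reading this thread: you can also convince yourself numerically that the gradient really is (f_wb(x) − y)·x[j], by comparing the analytic formula against a finite-difference approximation of the cost. This is just a quick sketch with made-up random data, not course code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(X, y, w, b):
    # mean binary cross-entropy loss over the dataset
    f = sigmoid(X @ w + b)
    return -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))

def grad_w(X, y, w, b):
    # analytic gradient: (1/m) * X^T (f - y)  -- note x[j] shows up here
    m = X.shape[0]
    f = sigmoid(X @ w + b)
    return (X.T @ (f - y)) / m

# small random problem (hypothetical data, just for the check)
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = rng.integers(0, 2, size=20).astype(float)
w = rng.normal(size=3)
b = 0.1

# central finite-difference estimate of dJ/dw[j] for each j
eps = 1e-6
numeric = np.array([
    (cost(X, y, w + eps * np.eye(3)[j], b)
     - cost(X, y, w - eps * np.eye(3)[j], b)) / (2 * eps)
    for j in range(3)
])

print(np.allclose(grad_w(X, y, w, b), numeric, atol=1e-6))
```

If the formula were missing the x[j] factor, this check would fail immediately.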
