I’m trying to derive the gradient descent formula for logistic regression, which the lecture says has the same form as the gradient descent formula for linear regression except that f_{\vec w,b}(\vec x^{(i)})=\frac{1}{1+e^{-(\vec w \cdot \vec x^{(i)} +b)}}. I’m carefully trying to set up the partial derivative to get \frac{\partial J(\vec w,b)}{\partial w_j}, but I cannot complete it because the expression becomes more and more complex.


Hello @eslam_shaheen

Welcome to the community.

It becomes a lot easier if you apply the chain rule.

We have:

z = \vec w \cdot \vec x + b

f(z) = \frac {1} {1+e^{-z}}

Loss^{(i)}= -y^{(i)} \log(f^{(i)}) - ( 1- y^{(i)}) \log(1- f^{(i)}), \quad \text{where } f^{(i)} = f(z^{(i)})

J = \frac {1} {m} \sum_{i=1}^m Loss^{(i)}

Now apply the Chain Rule and it becomes pretty straightforward. Take a look here
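In case it helps to see the pieces, here is a sketch of how the three chain-rule factors combine, using the definitions above:

\frac{\partial Loss^{(i)}}{\partial f^{(i)}} = -\frac{y^{(i)}}{f^{(i)}} + \frac{1-y^{(i)}}{1-f^{(i)}}, \qquad \frac{\partial f^{(i)}}{\partial z^{(i)}} = f^{(i)}(1-f^{(i)}), \qquad \frac{\partial z^{(i)}}{\partial w_j} = x_j^{(i)}

Multiplying them, the messy terms cancel:

\left( -\frac{y^{(i)}}{f^{(i)}} + \frac{1-y^{(i)}}{1-f^{(i)}} \right) f^{(i)}(1-f^{(i)}) \, x_j^{(i)} = \left( f^{(i)} - y^{(i)} \right) x_j^{(i)}

so that

\frac{\partial J(\vec w,b)}{\partial w_j} = \frac{1}{m} \sum_{i=1}^m \left( f^{(i)} - y^{(i)} \right) x_j^{(i)}, \qquad \frac{\partial J(\vec w,b)}{\partial b} = \frac{1}{m} \sum_{i=1}^m \left( f^{(i)} - y^{(i)} \right)

which is exactly the same form as the linear regression gradient, with the sigmoid inside f.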
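If you want to sanity-check the result numerically, here is a minimal NumPy sketch (toy data and variable names are my own, not from the course) that compares the analytic gradient \frac{1}{m}\sum (f^{(i)}-y^{(i)})x_j^{(i)} against a central finite-difference estimate of the cost:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(w, b, X, y):
    # J = (1/m) * sum of the logistic loss over the examples
    f = sigmoid(X @ w + b)
    return -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))

def analytic_grad(w, b, X, y):
    # Chain-rule result: dJ/dw_j = (1/m) * sum (f - y) * x_j
    m = X.shape[0]
    err = sigmoid(X @ w + b) - y
    return X.T @ err / m, err.mean()

# toy data, arbitrary values just for the check
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = (rng.random(20) > 0.5).astype(float)
w, b = rng.normal(size=3), 0.1

dw, db = analytic_grad(w, b, X, y)

# central finite-difference check of dJ/dw_j
eps = 1e-6
for j in range(3):
    wp, wm = w.copy(), w.copy()
    wp[j] += eps
    wm[j] -= eps
    num = (cost(wp, b, X, y) - cost(wm, b, X, y)) / (2 * eps)
    assert abs(num - dw[j]) < 1e-6
```

If the derivation were wrong, the assertions would fail; agreement to ~1e-6 is strong evidence the (f - y) x_j form is correct.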
