Gradient Descent formula for logistic regression

Hello @eslam_shaheen

Welcome to the community.

It becomes a lot easier if you apply the chain rule.

We have:
z = w \cdot x + b
f(z) = \frac {1} {1+e^{-z}}
Loss^{(i)} = -y^{(i)} \log(f^{(i)}) - (1 - y^{(i)}) \log(1 - f^{(i)}), \quad \text{where } f^{(i)} = f(z^{(i)})
J = \frac {1} {m} \sum_{i=1}^m Loss^{(i)}

Now apply the Chain Rule and it becomes pretty straightforward. Take a look here.
