Currently I’m in the Gradient Descent Implementation video of Week 3.
When the gradient descent algorithm is updated with the logistic regression model for z, it seems the cost function’s big formula with the ‘logs’ gets dropped: no logarithm appears inside the sum of the final calculation.
And what about the g(z) formula itself — is it equivalent to the log loss and cost functions from the beginning? Was this stated somewhere earlier in the course, or is there something I do not understand? An exponential expression should not be equal to a logarithmic loss or cost function.
At time mark 1:43, when the partial derivative of the cost function is computed (for the gradient descent algorithm), there are no logarithms inside the sum.
Essentially it is about how to derive the gradients.
l = -y\log{p} - (1-y)\log{(1-p)}
p = \frac{1}{1+\exp{(-z)}}
z = wx + b
Because \frac{\partial{l}}{\partial{w}} = \frac{\partial{l}}{\partial{p}} \frac{\partial{p}}{\partial{z}} \frac{\partial{z}}{\partial{w}}, we only need to derive each of the terms separately, as sketched below.
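For a single sample, the individual factors work out to (a quick sketch from the definitions above; worth checking the algebra yourself):

\frac{\partial{l}}{\partial{p}} = -\frac{y}{p} + \frac{1-y}{1-p}, \qquad \frac{\partial{p}}{\partial{z}} = p(1-p), \qquad \frac{\partial{z}}{\partial{w}} = x

Multiplying them together, the logs and the exponentials cancel out:

\frac{\partial{l}}{\partial{w}} = \left(-\frac{y}{p} + \frac{1-y}{1-p}\right) p(1-p)\, x = \big(-y(1-p) + (1-y)p\big)\, x = (p - y)\, x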
The \log disappears when we differentiate the loss with respect to p. If you multiply the three terms above together and replace p with f(x), you recover the formula in the slide, but for a single sample. You recover the full formula once you put the summation and the 1/m back in.
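If it helps to see it numerically, here is a minimal NumPy sketch (not the course’s code; the function names and the toy data are made up for illustration) that compares the “log-free” analytic gradient (1/m) \sum (f(x)-y)x against a finite-difference derivative of the log-loss cost — they agree, even though the cost itself contains logarithms:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(X, y, w, b):
    # Log-loss cost: J = -(1/m) * sum(y*log(p) + (1-y)*log(1-p))
    p = sigmoid(X @ w + b)
    m = X.shape[0]
    return -(1.0 / m) * np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def analytic_gradients(X, y, w, b):
    # After the chain rule the logs cancel, leaving
    # dJ/dw = (1/m) * sum((p - y) * x) and dJ/db = (1/m) * sum(p - y).
    p = sigmoid(X @ w + b)
    m = X.shape[0]
    dj_dw = (1.0 / m) * (X.T @ (p - y))
    dj_db = (1.0 / m) * np.sum(p - y)
    return dj_dw, dj_db

# Toy data, made up just for this check
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))
y = np.array([0, 1, 1, 0, 1], dtype=float)
w = rng.normal(size=2)
b = 0.3

dj_dw, dj_db = analytic_gradients(X, y, w, b)

# Central finite differences on the log-loss cost
eps = 1e-6
num_dw = np.array([
    (cost(X, y, w + eps * np.eye(2)[j], b) - cost(X, y, w - eps * np.eye(2)[j], b)) / (2 * eps)
    for j in range(2)
])
num_db = (cost(X, y, w, b + eps) - cost(X, y, w, b - eps)) / (2 * eps)

print(np.allclose(dj_dw, num_dw), np.allclose(dj_db, num_db))  # True True
```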