Week 3: Derivative of J(w,b) for the sigmoid cost equal to that of the quadratic (linear regression) cost?

Hi, it is stated in Week 3, Video 1 that logistic gradient descent looks the same as linear gradient descent, only with a different function f: dJ/dw = Sum[(f(x) - y) * x].
For the linear cost function J(w,b) = Sum[((w * x + b) - y)^2], I understand that J' = 2 * ((w * x + b) - y) * x.
(This is the chain rule with outer function u(x)^2 → 2 * u(x) and inner function v(x) = w * x + b → x.)
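
Written out per training example (ignoring any constant scaling factor the course may put in front of the sum), that chain-rule step is:

$$\frac{\partial}{\partial w}\big((wx + b) - y\big)^2 = 2\big((wx + b) - y\big) \cdot \frac{\partial}{\partial w}(wx + b) = 2\big((wx + b) - y\big)\,x$$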

Please help me to understand: how can the new logistic J-function be differentiated to the same result (f(x) - y) * x?
J for logistic regression is -Sum[y * log(1/(1 + e^(-(w * x + b)))) + (1 - y) * log(1 - 1/(1 + e^(-(w * x + b))))], and when I differentiate it, I get a couple of e^... terms in several fractions and sums, with different exponents.
→ I can’t reduce that derivative to the simple term (f(x) - y) * x.
→ Is there a link where I can see that transformation? Or am I completely misunderstanding the topic?

Thanks a lot in advance!

Check this out.
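
For anyone reading this later, here is a sketch of the key algebra (per training example, dropping the sum and any 1/m factor; write z = w * x + b and f = sigmoid(z) = 1/(1 + e^(-z))). The one fact that collapses all the e^... fractions is the sigmoid derivative sigma'(z) = sigma(z) * (1 - sigma(z)):

$$\frac{\partial f}{\partial w} = f(1-f)\,x$$

$$\frac{\partial J}{\partial w} = -\frac{\partial}{\partial w}\Big[\,y\log f + (1-y)\log(1-f)\,\Big] = -\Big[\frac{y}{f} - \frac{1-y}{1-f}\Big]\,f(1-f)\,x$$

$$= -\big[\,y(1-f) - (1-y)f\,\big]\,x = -(y - f)\,x = (f - y)\,x$$

As a quick sanity check in Python (hypothetical numbers, a single training example), the analytic gradient (f(x) - y) * x matches a finite-difference estimate of the cost's derivative:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(w, b, x, y):
    # per-example logistic (cross-entropy) cost
    f = sigmoid(w * x + b)
    return -(y * np.log(f) + (1 - y) * np.log(1 - f))

w, b, x, y = 0.7, -0.3, 2.5, 1.0   # arbitrary example values
eps = 1e-6

analytic = (sigmoid(w * x + b) - y) * x
numeric = (cost(w + eps, b, x, y) - cost(w - eps, b, x, y)) / (2 * eps)
print(analytic, numeric)  # the two values should agree to roughly 6 decimal places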

Wow, great!
Thank you.