Gradient Descent for Logistic Regression: Need Help with Intuition

In the lecture, it is shown that the value of ∂J/∂w_j is:

∂J/∂w_j = (1/m) * Σ_{i=1}^{m} (f_wb(x^(i)) - y^(i)) * x_j^(i)

This gradient was derived for linear regression, where f_wb(x) = w·x + b.

But for logistic regression, f_wb(x) = 1 / (1 + e^(-(w·x + b))).
So how do the partial derivatives come out the same when the expressions for f_wb are different?

Is that a property of exponents, or am I missing something?

It is a happy outcome of computing the partial derivative of the logistic cost function when the sigmoid is used to compute f_wb.
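To see why, here is a sketch of the derivation, assuming the standard log-loss cost used in the course (the cost and notation below are the usual ones, not copied from the lecture slides):

```latex
J(w,b) = -\frac{1}{m}\sum_{i=1}^{m}\Big[\, y^{(i)}\log f_{w,b}(x^{(i)})
       + (1-y^{(i)})\log\big(1 - f_{w,b}(x^{(i)})\big) \Big]

% With f = \sigma(z),\; z = w\cdot x + b,\; \text{and } \sigma'(z) = \sigma(z)(1-\sigma(z)):

\frac{\partial J}{\partial w_j}
  = -\frac{1}{m}\sum_{i=1}^{m}\left[\frac{y^{(i)}}{f} - \frac{1-y^{(i)}}{1-f}\right] f(1-f)\, x_j^{(i)}
  = \frac{1}{m}\sum_{i=1}^{m}\big(f_{w,b}(x^{(i)}) - y^{(i)}\big)\, x_j^{(i)}
```

The key step is that the factor f(1-f) from the sigmoid's derivative exactly cancels the f and (1-f) denominators coming from the logs, leaving the simple (f - y)·x form.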

It turns out to be the same form as for linear regression.
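You can also verify this numerically. The sketch below (with made-up data; `sigmoid`, `cost`, and the sample values are all my own, not from the lecture) compares the analytic gradient (1/m)·Σ(f_wb(x) - y)·x against a finite-difference gradient of the log-loss cost:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny made-up 1-feature dataset, just for illustration
xs = [0.5, 1.5, -1.0, 2.0]
ys = [0, 1, 0, 1]
w, b = 0.3, -0.1
m = len(xs)

def cost(w, b):
    # Logistic (log-loss) cost J(w, b)
    total = 0.0
    for x, y in zip(xs, ys):
        f = sigmoid(w * x + b)
        total += -y * math.log(f) - (1 - y) * math.log(1 - f)
    return total / m

# Analytic gradient: the same form as in linear regression,
# dJ/dw = (1/m) * sum((f_wb(x_i) - y_i) * x_i)
analytic = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / m

# Numerical gradient via central difference on the cost
eps = 1e-6
numeric = (cost(w + eps, b) - cost(w - eps, b)) / (2 * eps)

print(abs(analytic - numeric) < 1e-6)  # the two gradients agree
```

If the derivative did not simplify to the linear-regression form, these two numbers would disagree.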