# Derivative of "Simplified Cost Function"

In the workbook C1_W3_Lab06_Gradient_Descent_Soln, the instructor explains that w_j and b are updated like this:
$$w_j = w_j - \alpha \frac{\partial J(\mathbf{w},b)}{\partial w_j}$$

$$b = b - \alpha \frac{\partial J(\mathbf{w},b)}{\partial b}$$

The corresponding code looks pretty much like the code that calculates the gradients for linear regression, but with an extra call to the sigmoid function. Note that the original linear regression code assumes that J(w,b) is the squared-error (least squares) cost.

    for i in range(m):
        f_wb_i = sigmoid(np.dot(X[i],w) + b)      #(n,)(n,)=scalar
        err_i  = f_wb_i - y[i]                    #scalar
        for j in range(n):
            dj_dw[j] = dj_dw[j] + err_i * X[i,j]  #scalar
        dj_db = dj_db + err_i
    dj_dw = dj_dw/m                               #(n,)
    dj_db = dj_db/m                               #scalar
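For concreteness, the loop above can be wrapped into a self-contained function. The name `compute_gradient_logistic` follows the lab's naming; the `sigmoid` helper here is my own minimal sketch:

```python
import numpy as np

def sigmoid(z):
    # Logistic function; maps any real z into (0, 1)
    return 1 / (1 + np.exp(-z))

def compute_gradient_logistic(X, y, w, b):
    """Gradient of the logistic cost for examples X (m,n), labels y (m,)."""
    m, n = X.shape
    dj_dw = np.zeros(n)
    dj_db = 0.0
    for i in range(m):
        f_wb_i = sigmoid(np.dot(X[i], w) + b)   # scalar prediction
        err_i = f_wb_i - y[i]                   # scalar error
        for j in range(n):
            dj_dw[j] += err_i * X[i, j]
        dj_db += err_i
    return dj_dw / m, dj_db / m
```

With `w = 0` and `b = 0` every prediction is 0.5, so the gradient reduces to the average of `(0.5 - y[i]) * X[i]`, which is an easy sanity check.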



However, the compute_cost_logistic function uses a different cost function, which is shown in the video "Simplified Cost Function for Logistic Regression". The video does not explain what $\frac{\partial J(\mathbf{w},b)}{\partial w_j}$ and $\frac{\partial J(\mathbf{w},b)}{\partial b}$ look like in this case.
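For reference, the simplified cost from that video is $J(\mathbf{w},b) = -\frac{1}{m}\sum_{i} \left[ y^{(i)} \log f_{\mathbf{w},b}(\mathbf{x}^{(i)}) + (1-y^{(i)}) \log\left(1 - f_{\mathbf{w},b}(\mathbf{x}^{(i)})\right) \right]$. A minimal sketch of it in code (the function name follows the lab; the rest is illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def compute_cost_logistic(X, y, w, b):
    """Log loss averaged over m examples X (m,n) with labels y (m,)."""
    m = X.shape[0]
    cost = 0.0
    for i in range(m):
        f_wb_i = sigmoid(np.dot(X[i], w) + b)
        cost += -y[i] * np.log(f_wb_i) - (1 - y[i]) * np.log(1 - f_wb_i)
    return cost / m
```

With `w = 0` and `b = 0` every prediction is 0.5, so the per-example loss is $-\log(0.5) = \log 2 \approx 0.693$ regardless of the label.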

Here is my derivation of $\frac{\partial J(\mathbf{w},b)}{\partial \mathbf{w}}$; it is not equivalent to the code listed above:

Please explain why the example code does not use the actual derivative. What am I missing?

Please refer to the following post for the derivation steps, which show that the gradients for linear regression and logistic regression do look the same.

In particular, if you look at the second table, you will see clearly that the denominator terms generated in logistic regression by the loss function are canceled out, thanks to the derivative of the sigmoid function. After the cancellation, the gradient terms for linear regression and logistic regression no longer look any different.
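That cancellation can be sketched for a single example (dropping the index $i$), with $f = \sigma(z)$ and $z = \mathbf{w}\cdot\mathbf{x} + b$:

$$\frac{\partial L}{\partial f} = -\frac{y}{f} + \frac{1-y}{1-f}, \qquad \frac{\partial f}{\partial z} = f(1-f), \qquad \frac{\partial z}{\partial w_j} = x_j$$

Applying the chain rule, the $f$ and $1-f$ denominators cancel against the sigmoid's derivative:

$$\frac{\partial L}{\partial w_j} = \left(-\frac{y}{f} + \frac{1-y}{1-f}\right) f(1-f)\, x_j = \left(-y(1-f) + (1-y)f\right) x_j = (f - y)\, x_j$$

which is exactly the $(f_{\mathbf{w},b}(\mathbf{x}) - y)\, x_j$ form the code computes, the same shape as the linear regression gradient (only the definition of $f$ differs).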

Cheers,
Raymond