Gradient for Logistic Regression: Error Term

Good Day,

Could you please help me understand how and why we get the error term when calculating the gradient for logistic regression, specifically in the derivative with respect to w?

for i in range(m):
    f_wb_i = sigmoid(np.dot(X[i], w) + b)    # (n,) dot (n,) = scalar prediction
    err_i  = f_wb_i - y[i]                   # scalar error term
    for j in range(n):
        dj_dw[j] += err_i * X[i, j]          # accumulate gradient for w_j
    dj_db += err_i                           # accumulate gradient for b

Hi @vinay6

First, please double-check the indentation; a stray space or tab can move a statement into the wrong loop. Second, after the loop and before the return, you need to divide both dj_db and dj_dw by m.

Note that sharing code is not allowed here.

Cheers!
Abdelrahman

Good Day,

Apologies, but this is not code that I wrote; it is from the practice lab, and I quoted it only to point to the error term I wanted to ask about.

Mathematically, the error term comes from computing the partial derivative of the cost function, as sketched below.
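Here is a short sketch of that derivation, assuming the standard cross-entropy cost and sigmoid activation used in the course (the notation below is my own). With $z^{(i)} = \mathbf{w} \cdot \mathbf{x}^{(i)} + b$ and $f^{(i)} = \sigma(z^{(i)})$, the per-example loss is

$$L^{(i)} = -\left[\, y^{(i)} \log f^{(i)} + \left(1 - y^{(i)}\right) \log\left(1 - f^{(i)}\right) \right].$$

Using $\sigma'(z) = \sigma(z)\left(1 - \sigma(z)\right)$, the chain rule collapses neatly:

$$\frac{\partial L^{(i)}}{\partial z^{(i)}} = \frac{f^{(i)} - y^{(i)}}{f^{(i)}\left(1 - f^{(i)}\right)} \cdot f^{(i)}\left(1 - f^{(i)}\right) = f^{(i)} - y^{(i)}.$$

Since $\partial z^{(i)} / \partial w_j = x_j^{(i)}$ and $\partial z^{(i)} / \partial b = 1$, averaging over the $m$ examples gives

$$\frac{\partial J}{\partial w_j} = \frac{1}{m} \sum_{i=1}^{m} \left(f^{(i)} - y^{(i)}\right) x_j^{(i)}, \qquad \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^{m} \left(f^{(i)} - y^{(i)}\right).$$

That $f^{(i)} - y^{(i)}$ factor is exactly the err_i = f_wb_i - y[i] accumulated in the loop, with the division by m applied after the loop.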

FYI, that code is provided in the ungraded Week 3 lab notebook: “C1_W3_Lab06_Gradient_Descent_Soln”
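For anyone landing here later, below is a minimal, self-contained sketch of how the pieces fit together once the averaging is added. This is my own rewrite for illustration, not the notebook's exact code; the name compute_gradient and the variable names are simply chosen to match the snippet above:

import numpy as np

def sigmoid(z):
    # Logistic function: maps any real z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def compute_gradient(X, y, w, b):
    # X: (m, n) features, y: (m,) labels, w: (n,) weights, b: scalar bias
    m, n = X.shape
    dj_dw = np.zeros(n)
    dj_db = 0.0
    for i in range(m):
        err_i = sigmoid(np.dot(X[i], w) + b) - y[i]   # error term f_wb_i - y[i]
        for j in range(n):
            dj_dw[j] += err_i * X[i, j]               # accumulate per-feature gradient
        dj_db += err_i                                # accumulate bias gradient
    # Average over all m examples, as noted in the reply above
    return dj_dw / m, dj_db / m

The inner loop over j can also be replaced by the vectorized dj_dw += err_i * X[i], which computes the same thing in one step.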