Gradient function for regularized linear regression

Can someone explain this code to me? I'm not understanding it.
Gradient descent equation:

In the update equation, the term $\alpha \cdot \frac{1}{m} \sum_{i=0}^{m-1} \left( f_{\vec{w},b}(\vec{x}^{(i)}) - y^{(i)} \right) x_j^{(i)}$ is subtracted from $w_j$, but in the code I don't see this term being subtracted anywhere.
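For reference, the full update rule for regularized linear regression also folds the regularization term into the same bracket (this is the standard form; the exact notation here is mine, not copied from the assignment):

$$w_j := w_j - \alpha \left[ \frac{1}{m} \sum_{i=0}^{m-1} \left( f_{\vec{w},b}(\vec{x}^{(i)}) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} w_j \right]$$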

The code you quoted just computes the regularized gradients for logistic regression.

It doesn't modify the weight values; that update happens in a different function.
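To illustrate the separation, here is a minimal sketch in the spirit of the assignment code; the function names, signatures, and the linear-regression prediction are assumptions, not the course's exact code (for logistic regression, only the prediction line would change to a sigmoid):

```python
import numpy as np

def compute_gradient_reg(X, y, w, b, lambda_):
    # Returns the regularized gradients only -- no weight update happens here.
    m = X.shape[0]
    err = (X @ w + b) - y                          # f_{w,b}(x^(i)) - y^(i) for all i
    dj_dw = (X.T @ err) / m + (lambda_ / m) * w    # dJ/dw_j, with the L2 term
    dj_db = np.sum(err) / m                        # dJ/db (b is not regularized)
    return dj_dw, dj_db

def gradient_descent(X, y, w, b, alpha, lambda_, num_iters):
    # The subtraction asked about happens here, not in compute_gradient_reg.
    for _ in range(num_iters):
        dj_dw, dj_db = compute_gradient_reg(X, y, w, b, lambda_)
        w = w - alpha * dj_dw                      # w_j := w_j - alpha * dJ/dw_j
        b = b - alpha * dj_db                      # b   := b   - alpha * dJ/db
    return w, b
```

Notice that `alpha` never appears in `compute_gradient_reg`, which is why no subtraction shows up in the gradient function itself.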

Now I get it. Thank you, TMosh! :+1:
