# A doubt in C1_W3 Lecture

Could you please guide me on this issue?

May I know why this sum term is excluded? It seems like it should remain after computing the derivative.

The picture is taken from this learning material:

Supervised Machine Learning: Regression and Classification
→ Week 3
→ Regularized linear regression
→ Time: 8:01 / 8:52

Thank you.

It is because, when differentiating the cost J with respect to a single w_j, the other terms in that sum are constants, and the derivative of a constant is 0.

Dear Mr Gent,

From the attached picture above, by convention the differentiation of the term should look like this

but the summation

is excluded in the calculation shown on the lecture slide.

If the derivative of that term is 0 because it is a constant, may I know why it shows this result of differentiation?

Thank you.

@JJaassoonn, That summation should not be included because we are taking the partial derivative of cost J with respect to a single parameter w_j while keeping all other variables constant.

Here’s an example:
Let j range from 1 to 3, and let's compute the partial derivative of the following expression with respect to w_2 only (i.e., j = 2):

\frac{\partial}{\partial w_2}\left(\frac{\lambda}{2m}\sum\limits_{j=1}^{3}w_j^2\right) = \frac{\partial}{\partial w_2}\,\frac{\lambda}{2m}\left[w_1^2 + w_2^2 + w_3^2\right], where \sum\limits_{j=1}^{3}w_j^2 = w_1^2 + w_2^2 + w_3^2

The derivative of w_2^2 is 2w_2, and since w_1^2 and w_3^2 do not involve the variable w_2, they are treated as constants, each with a derivative of 0.

Thus, the equation simplifies to:

\frac{\partial}{\partial w_2}\left(\frac{\lambda}{2m}\sum\limits_{j=1}^{3}w_j^2\right) = \frac{\lambda}{2m}\left[0 + 2w_2 + 0\right]

= \frac{\lambda}{2m}\, 2w_2

= \frac{\lambda}{m}w_2, \quad or, in general, \quad \frac{\partial}{\partial w_j}\left(\frac{\lambda}{2m}\sum\limits_{k=1}^{3}w_k^2\right) = \frac{\lambda}{m}w_j.
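The derivation above can also be checked numerically. Here is a minimal sketch (the values of lambda, m, and w are arbitrary illustrations, not from the lecture) comparing the analytic result \frac{\lambda}{m}w_2 against a central-difference approximation of the regularization term's partial derivative with respect to w_2:

```python
import numpy as np

# Illustrative values (not from the lecture)
lam, m = 0.5, 10
w = np.array([1.0, 2.0, 3.0])  # w_1, w_2, w_3

def reg_term(w):
    # Regularization term from the slide: (lambda / (2m)) * sum_j w_j^2
    return (lam / (2 * m)) * np.sum(w ** 2)

# Analytic partial derivative with respect to w_2: (lambda / m) * w_2
analytic = (lam / m) * w[1]

# Numerical check: central difference, perturbing only w_2
eps = 1e-6
w_plus, w_minus = w.copy(), w.copy()
w_plus[1] += eps
w_minus[1] -= eps
numeric = (reg_term(w_plus) - reg_term(w_minus)) / (2 * eps)

print(analytic, numeric)  # the two values agree to high precision
```

Perturbing only w_2 leaves w_1^2 and w_3^2 unchanged in both evaluations, which is exactly why those terms contribute 0 to the derivative.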

And that’s what Gent meant to say!


Mr Mujassim_Jamal, Thank you so much for your kind explanation.

Mr Gent, Thanks for your guidance too.
