# Multiple Linear Regression - Gradient Descent formula

On the right-hand side (n features), why has the first x term changed to the vector \vec{x}, while the second x term is x_1? Why is one x term the whole vector and the other a single component of that vector?

Hello @Robert_Whiteley

Welcome to the community.

We have:

w_1 = w_1 - \alpha .\frac{\partial J}{\partial w_{1}}

where
\frac{\partial J}{\partial w_{1}} =\frac{1}{m} \sum_{i=1}^m \lgroup w_{1}.x_1^{(i)} + w_{2}.x_2^{(i)}+...+ w_{n}.x_n^{(i)} +b-y^{(i)}\rgroup.x_1^{(i)}

\frac{\partial J}{\partial w_{1}} =\frac{1}{m} \sum_{i=1}^m \lgroup\sum_{j=1}^n w_{j}.x_j^{(i)}+b-y^{(i)}\rgroup.x_1^{(i)}

\frac{\partial J}{\partial w_{1}} = \frac {1} {m} \sum_{i=1}^m (f_{w,b}(\vec{x}^{(i)}) - y^{(i)}).x_1^{(i)}

Where
\sum_{j=1}^n w_{j}.x_j^{(i)}+b = f_{w,b}(\vec{x}^{(i)})

As you can see here, when we evaluate \frac{\partial J}{\partial w_{1}} we multiply the error term only by x_1^{(i)}, not by the full vector (x_1^{(i)}, x_2^{(i)},...,x_n^{(i)}).

Likewise, when we are evaluating \frac{\partial J}{\partial w_{2}}

\frac{\partial J}{\partial w_{2}} =\frac{1}{m} \sum_{i=1}^m \lgroup\sum_{j=1}^n w_{j}.x_j^{(i)}+b-y^{(i)}\rgroup.x_2^{(i)}

\frac{\partial J}{\partial w_{2}} = \frac {1} {m} \sum_{i=1}^m (f_{w,b}(\vec{x}^{(i)}) - y^{(i)}).x_2^{(i)}

Where
\sum_{j=1}^n w_{j}.x_j^{(i)}+b = f_{w,b}(\vec{x}^{(i)})

So, in the general case, for any feature j: \frac{\partial J}{\partial w_{j}} = \frac {1} {m} \sum_{i=1}^m (f_{w,b}(\vec{x}^{(i)}) - y^{(i)}).x_j^{(i)}
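The general-case gradient above can be computed for all features at once with NumPy, since \frac{1}{m}\sum_i (f_{w,b}(\vec{x}^{(i)}) - y^{(i)})\,x_j^{(i)} for every j is just X^T \cdot \text{err} / m. This is only a minimal sketch with a made-up toy dataset, not the course's assignment code:

```python
import numpy as np

# Hypothetical toy dataset: m = 4 examples, n = 3 features.
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 0.0, 1.0],
              [0.5, 1.5, 2.5],
              [3.0, 1.0, 0.0]])
y = np.array([10.0, 5.0, 8.0, 6.0])

w = np.zeros(3)   # one weight per feature
b = 0.0
alpha = 0.01      # learning rate (assumed value for illustration)
m = X.shape[0]

for _ in range(1000):
    # f_{w,b}(x^(i)) - y^(i) for every example at once
    err = X @ w + b - y            # shape (m,)
    # dJ/dw_j = (1/m) * sum_i err_i * x_j^(i), vectorized for all j
    dJ_dw = X.T @ err / m          # shape (n,)
    dJ_db = err.mean()
    w -= alpha * dJ_dw
    b -= alpha * dJ_db
```

Note that every feature's partial derivative uses the *same* error term; only the trailing x_j^{(i)} factor differs, which is exactly the point of the thread.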

So, it’s because we are differentiating with respect to w_1, which means all of the other w_j terms are treated as constants and vanish when you differentiate? And the w_1 x_1 term differentiates to x_1?

I think I’ve got it, if that’s correct.


@Robert_Whiteley

That is correct.
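As a quick sanity check of that reasoning, you can differentiate symbolically. The sketch below (using SymPy, with a hypothetical 2-feature, single-example squared-error term) confirms that the w_2 and b terms vanish and the chain rule leaves (f - y)\,x_1:

```python
import sympy as sp

# Symbols for one training example with n = 2 features (illustrative only).
w1, w2, b, x1, x2, y = sp.symbols('w1 w2 b x1 x2 y')

f = w1 * x1 + w2 * x2 + b              # model prediction f_{w,b}(x)
J = sp.Rational(1, 2) * (f - y) ** 2   # squared-error term for one example

# w2*x2 and b are constants w.r.t. w1, and d(w1*x1)/dw1 = x1,
# so the chain rule should leave (f - y) * x1.
dJ_dw1 = sp.diff(J, w1)
print(sp.simplify(dJ_dw1 - (f - y) * x1))  # -> 0
```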


Thank you. I think I worked it out just before I saw your answer, but that helped to cement it.


You are most welcome!