Cost function for polynomial regression

Is the cost function for polynomial regression a squared error cost function?

If yes, then while running gradient descent, won't the derivative term for the higher-degree polynomial terms take a different form than the usual one mentioned in the course? Am I right about this?

Hello @Thala

The important thing to note here is that we take the derivative of the cost $J$ with respect to $w$, i.e. $\frac{\partial J}{\partial w}$, and NOT with respect to $x$, i.e. $\frac{\partial J}{\partial x}$. Hence the polynomial terms are all constants as far as the derivative with respect to $w$ is concerned.

So the expression $\frac{\partial J}{\partial w_j} = \frac{1}{m}\sum_{i=1}^{m} \left(f_{w,b}(x^{(i)}) - y^{(i)}\right) x_j^{(i)}$ still holds even when we have polynomial terms.
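As a minimal sketch of this (hypothetical function and variable names, assuming NumPy and the vectorized form of the sum above):

```python
import numpy as np

def gradients(X, y, w, b):
    """Gradient of the squared error cost J = (1/2m) * sum((f - y)^2)
    with respect to w and b. X has shape (m, n); each column is a
    feature, whether raw or an engineered polynomial term."""
    m = X.shape[0]
    err = X @ w + b - y        # f_{w,b}(x^(i)) - y^(i) for all samples i
    dJ_dw = (X.T @ err) / m    # (1/m) * sum of err * x_j, per feature j
    dJ_db = err.mean()         # (1/m) * sum of err
    return dJ_dw, dJ_db
```

Notice that nothing in this computation cares about *how* the columns of `X` were produced; the same expression is evaluated column by column.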

If $x_j$ happens to be $x^2$ (say), so be it. We only need to provide the values that this so-called $x^2$ feature takes for each sample $i$ in order to evaluate the expression for $\frac{\partial J}{\partial w_j}$.
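For example (a made-up setup, reusing the `gradients` sketch above): if the model is $f_{w,b}(x) = w_1 x + w_2 x^2 + b$, we simply stack $x$ and $x^2$ as two columns of the feature matrix, and the gradient computation is unchanged:

```python
x = np.array([1.0, 2.0, 3.0, 4.0])     # raw input values
y = np.array([2.0, 9.0, 22.0, 41.0])   # hypothetical targets

X = np.column_stack([x, x**2])         # column 0 is x, column 1 is x^2
w = np.zeros(2)                        # one weight per feature column
b = 0.0

dJ_dw, dJ_db = gradients(X, y, w, b)   # same expression, no new form
```

The $x^2$ column is just data by the time the derivative is taken, which is exactly why the usual gradient formula from the course carries over to polynomial regression.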