Hello, can we apply gradient descent to a higher-order polynomial? Because the cost function in this case becomes non-convex and has multiple local minima.

Yes, @GAURAVKUMAR_PRAVINBH.

Firstly, it is false that gradient descent can only be applied to a linear problem or a convex problem. The drawback of a non-convex problem is only that there is no guarantee of reaching the global minimum, but that does not mean gradient descent cannot be applied to such a case.
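To illustrate (a minimal sketch with a made-up 1-D function, not from the course): gradient descent runs fine on a non-convex function; it simply settles into whichever local minimum the starting point leads to.

```python
def f(w):
    # A non-convex function with two local minima and one local maximum.
    return w**4 - 3 * w**2 + w

def grad(w):
    # Derivative of f.
    return 4 * w**3 - 6 * w + 1

def gradient_descent(w, alpha=0.01, steps=5000):
    # Plain gradient descent: works on non-convex f, but the result
    # depends on the starting point.
    for _ in range(steps):
        w -= alpha * grad(w)
    return w

w_left = gradient_descent(-2.0)   # converges to the left local minimum
w_right = gradient_descent(2.0)   # converges to the right local minimum
print(w_left, w_right)
```

Both runs end at a point where the gradient is (numerically) zero, yet they end at different minima, which is exactly the "no global guarantee" caveat.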

Moreover, if we want to, we only need to create a new feature for each polynomial term to reduce a higher-order problem to a linear problem. Check the following out:
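One possible sketch of that reduction (assuming NumPy and made-up 1-D data, not the original attachment): by adding x² and x³ as extra features, the cubic model becomes linear in the parameters, and ordinary gradient descent on the squared-error cost applies directly.

```python
import numpy as np

# Hypothetical 1-D data with a cubic relationship plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=100)
y = 0.5 * x**3 - x + 1 + rng.normal(0, 0.1, size=100)

# One new feature per polynomial term: the model f = X @ w + b
# is now linear in the parameters w and b.
X = np.column_stack([x, x**2, x**3])

# Feature scaling helps gradient descent converge.
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xn = (X - mu) / sigma

# Plain batch gradient descent on the (convex) squared-error cost.
w = np.zeros(3)
b = 0.0
alpha = 0.1
m = len(y)
for _ in range(2000):
    err = Xn @ w + b - y
    w -= alpha * (Xn.T @ err) / m
    b -= alpha * err.mean()

print(w, b)
```

Since the cost is quadratic in w and b, this converges to the single global minimum regardless of the starting point.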

Cheers,

Raymond

No, that is not the case. Do not confuse the shape of the f_wb curve with the shape of the cost function.

The linear regression cost function is always quadratic in the parameters w and b (by definition), as it is the sum of the squares of the errors, and each error is linear in w and b. It is therefore always convex, even when f_wb is a polynomial in x.
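A quick numeric check of this (a sketch with made-up data): hold everything else fixed and sweep one weight w of a cubic model; the cost J(w) traces a parabola, so its second finite differences are constant and positive, which is the signature of a convex quadratic.

```python
import numpy as np

# Hypothetical data from a cubic relationship plus noise.
rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, size=50)
y = x**3 - 2 * x + rng.normal(0, 0.2, size=50)

def cost(w):
    # Squared-error cost of the model f_wb(x) = w * x**3 (b fixed at 0).
    # f_wb is non-linear in x, but J is quadratic in the parameter w.
    err = w * x**3 - y
    return np.mean(err**2) / 2

ws = np.linspace(-3, 3, 61)
J = np.array([cost(w) for w in ws])

# Second finite differences of a quadratic are constant; positive means convex.
d2 = np.diff(J, 2)
print(d2.min(), d2.max())  # (nearly) equal and > 0
```

The curve of f_wb against x is cubic, but the curve of J against w is a parabola; that is the distinction between the two shapes.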