Can a polynomial regressor be an extension of a multiple linear regressor?

I’m trying to implement a polynomial regressor; I’ve already implemented a multiple linear regressor.

As of now, my polynomial regressor computes the different powers of the input x and passes them to the multiple linear regressor, which treats x, x^2 (and the higher powers) as independent features.

This is convenient, but is it right?

I can see no problem here, since we take derivatives only with respect to w and b; the input x and its powers are treated as constants when computing the gradient.
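Here is a minimal sketch of what I mean; the function names, learning rate, and epoch count are just placeholders I picked for illustration:

```python
import numpy as np

def poly_features(x, degree):
    # Treat x, x^2, ..., x^degree as separate "independent" features.
    return np.column_stack([x ** d for d in range(1, degree + 1)])

def fit_linear(X, y, lr=0.5, epochs=20000):
    # Plain multiple linear regression trained by gradient descent.
    n, m = X.shape
    w = np.zeros(m)
    b = 0.0
    for _ in range(epochs):
        err = X @ w + b - y
        # Gradients are taken w.r.t. w and b only; the feature
        # columns (the powers of x) are constants here.
        w -= lr * (X.T @ err) / n
        b -= lr * err.mean()
    return w, b

# Recover y = 2x^2 - 3x + 1 by fitting a degree-2 model.
x = np.linspace(-1, 1, 50)
y = 2 * x ** 2 - 3 * x + 1
w, b = fit_linear(poly_features(x, degree=2), y)
# w should approach [-3, 2] and b should approach 1
```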

Is this a valid implementation?

I think it is right and valid! If you look at the “Notes” section of this numpy implementation, and also at the source code of polyfit, you will see that it calls this function to create the different powers of x according to the degree required by the model assumption.
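You can check this equivalence yourself. The sketch below (the data and coefficients are made up for the demo) builds the power-of-x design matrix with np.vander, solves the resulting multiple linear regression by least squares, and compares against np.polyfit:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 100)
y = 0.5 * x ** 3 - x + 4 + rng.normal(0, 0.1, 100)

# Design matrix of powers of x: columns are x^3, x^2, x^1, x^0
# (np.vander's default ordering, matching polyfit's coefficient order).
X = np.vander(x, N=4)

# Solving the multiple linear regression by least squares...
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

# ...agrees with fitting a degree-3 polynomial directly.
assert np.allclose(coeffs, np.polyfit(x, y, deg=3))
```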


Great, thanks for the effort.

You are welcome @OsamaAhmad!