Polynomial Regression: memory errors from the explosion in feature count when expanding polynomial features
As you can see, when I expand X into quadratic features with PolynomialFeatures, I run into memory problems because the number of parameters grows enormously. This happens even when I only want a 2nd-degree polynomial, and higher degrees would make the problem worse. Is there an alternative that avoids the memory overflow in polynomial regression? If so, what is it? What is the most sensible approach when I want to do polynomial regression, and how should I optimize it?
Creating new polynomial features always increases the size of the data set. There is no avoiding that.
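To see how fast that growth is: the number of columns `PolynomialFeatures(degree=d)` produces from `n` input features (including the bias column) is the number of monomials of total degree at most `d`, which is C(n + d, d). A small sketch (the function name is mine, not from the course):

```python
from math import comb

def n_poly_features(n_features: int, degree: int) -> int:
    # Count of monomials of total degree <= degree in n_features variables,
    # i.e. the output width of sklearn's PolynomialFeatures (with bias column).
    return comb(n_features + degree, degree)

for n in (10, 100, 1000):
    print(n, "->", n_poly_features(n, degree=2))
# 10 -> 66, 100 -> 5151, 1000 -> 501501: quadratic growth in n,
# which is why a wide dataset quickly exhausts memory.
```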
However, if you’re using a neural network, you typically do not need to create additional polynomial terms. This is because each hidden layer applies a non-linear activation function (ReLU, sigmoid, etc.), which allows each layer to build new non-linear combinations of the input features.
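A minimal NumPy sketch of that idea (weights and sizes are made up): the hidden layer keeps the raw 3-column input and produces non-linear combinations through ReLU, so no explicit `x_i * x_j` columns are ever stored.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))        # 3 raw input features only, no expansion
W1 = rng.normal(size=(3, 32)) * 0.1  # hypothetical hidden-layer weights
b1 = np.zeros(32)

# h = ReLU(X @ W1 + b1): the non-linearity is what lets later layers
# approximate quadratic (and higher-order) interactions of the inputs.
h = np.maximum(0.0, X @ W1 + b1)
print(h.shape)  # (200, 32) -- width set by the layer, not by C(n+d, d)
```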
Polynomial features are typically only used in linear or logistic regression.
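For the linear-regression case, a minimal scikit-learn sketch with made-up data: on a single input feature, a degree-2 expansion is tiny and the pipeline recovers a quadratic target exactly.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(100, 1))
y = 1.0 + 2.0 * x[:, 0] - 0.5 * x[:, 0] ** 2  # noiseless quadratic target

# Degree-2 expansion of one feature adds only [1, x, x^2] -- no memory issue.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(x, y)
print(round(model.score(x, y), 4))  # R^2 of 1.0 on the noiseless data
```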
In the model-evaluation lesson in the first week, it was discussed how to choose the degree of a polynomial function, so I tried to run polynomial regression and ran into these errors. What was the purpose of that lesson, then?
Polynomial regression is a useful tool, but it’s only feasible on datasets with a small number of features.