C1_W2_Lab04: getting the correct values of polynomial coefficients

Greetings!

It was exciting to learn that linear regression can be used for non-linear fitting through feature engineering and scaling.
Great trick!

However, how can I get the correct values of the polynomial coefficients in the result of the linear regression with both engineered and scaled features?

Thank you.

Sorry, I don’t totally understand your question.
You have highlighted some values in the image you posted. Why?

The true coefficients were 1 (for x^0), 2 (for x^1), 3 (for x^2) and 4 (for x^3).
However, the regression returned the highlighted coefficients.
Why is this happening?

The weights apply to the normalized polynomial features, not to the raw polynomial features.
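A minimal numpy sketch of this effect (the data and the least-squares fit here are my own illustration, not the lab's code): fitting y = 1 + 2x + 3x² + 4x³ on z-score normalized polynomial features returns weights scaled by each feature's standard deviation, not the true coefficients.

```python
import numpy as np

# Data generated from y = 1 + 2x + 3x^2 + 4x^3 (the true coefficients from this thread)
x = np.linspace(-2, 2, 50)
y = 1 + 2 * x + 3 * x**2 + 4 * x**3

# Engineer polynomial features, then z-score normalize each column
X = np.c_[x, x**2, x**3]
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xn = (X - mu) / sigma

# Least-squares fit on the normalized features (intercept column prepended)
A = np.c_[np.ones(len(x)), Xn]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b, w = coef[0], coef[1:]

print(w)  # not [2, 3, 4]: each weight is scaled by that feature's sigma
```

Because each column was divided by its standard deviation, the learned weight for column j comes out as (true coefficient) × sigma_j, which is why the highlighted values look "wrong" even though the model's predictions are correct.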

Good point!
However, to use the model in practice I need to know the true coefficients for the real features before normalization, right?

Not necessarily.

To use the weights to make new predictions, you first apply the same normalization to the new features.

Yes, I agree.
Now, how do I store and re-use exactly the same normalization that was used during the training stage?
One way I see is to compute the normalization parameters (mean, variance, max, min, depending on the normalization type) and store them somewhere. Right?


In this example, the values you would need are not being provided by the zscore_normalize_features() function.

For example, if you normalized by subtracting the mean and dividing by the standard deviation, you would need to save the values for the mean and standard deviation, so you could apply those to the new data before you use the model to make a prediction.
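A small sketch of that workflow (the helper names and data here are hypothetical, not from the lab — in the lab you would modify `zscore_normalize_features()` to also return `mu` and `sigma`):

```python
import numpy as np

def zscore_fit(X):
    """Compute the normalization parameters on the training data."""
    return X.mean(axis=0), X.std(axis=0)

def zscore_apply(X, mu, sigma):
    """Re-use the saved parameters on any later data."""
    return (X - mu) / sigma

# Training stage: compute and keep mu and sigma alongside the model
x_train = np.linspace(-2, 2, 50)
X_train = np.c_[x_train, x_train**2, x_train**3]
mu, sigma = zscore_fit(X_train)
Xn_train = zscore_apply(X_train, mu, sigma)

# Prediction stage: normalize NEW inputs with the SAME mu and sigma
x_new = np.array([0.5])
X_new = np.c_[x_new, x_new**2, x_new**3]
Xn_new = zscore_apply(X_new, mu, sigma)
```

The key point is that `mu` and `sigma` are computed once, on the training set, and then treated as part of the model: new data is never re-normalized with its own statistics.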

Exactly!

My idea was to recover the real polynomial coefficients, so one could plug the original feature values straight into the model without any modification or normalization.
Does this way make sense?

It’s an alternate method. With a bit of algebra you can probably figure out how to apply the normalization to the learned weights.
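Here is one way that algebra can work out, as a sketch (my own derivation, not code from the lab). Since each normalized feature is (x_j − mu_j)/sigma_j, substituting back into the fitted model gives the original-scale coefficients in closed form:

```python
import numpy as np

# Fit on z-score normalized features, then undo the normalization in the weights.
x = np.linspace(-2, 2, 50)
y = 1 + 2 * x + 3 * x**2 + 4 * x**3

X = np.c_[x, x**2, x**3]
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xn = (X - mu) / sigma

A = np.c_[np.ones(len(x)), Xn]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b_n, w_n = coef[0], coef[1:]

# Since Xn = (X - mu) / sigma:
#   y = b_n + Xn @ w_n = (b_n - (mu / sigma) @ w_n) + X @ (w_n / sigma)
w = w_n / sigma                      # original-scale polynomial coefficients
b = b_n - np.dot(mu / sigma, w_n)    # original-scale intercept

print(b, w)  # recovers approximately 1.0 and [2, 3, 4]
```

With `b` and `w` in hand, you can evaluate the polynomial on raw, un-normalized inputs directly, which is exactly the use case described above.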