Hi everyone, I am on week three of this course and I just got through this section on regularization. I was wondering if anybody could clarify for me how this is conceptually different from feature scaling?
Feature scaling adjusts the values of the input features (e.g., rescaling them to a similar range) so that gradient descent converges faster: fewer iterations and a larger learning rate without risking divergence. It changes the inputs, not the cost function.
Regularization helps prevent overfitting the training set: it adds a penalty on large parameter values to the cost function, so the model fits the noise in the training examples less and generalizes better to new data.
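To make the contrast concrete, here is a small sketch (the data, the variable names, and the specific λ values are made up for illustration): scaling transforms the feature matrix `X`, while L2 regularization adds a `λ · Σw²` term to the cost.

```python
import numpy as np

# Hypothetical toy data: 5 examples, 2 features on very different scales.
X = np.array([[2000.0, 3.0],
              [1600.0, 2.0],
              [2400.0, 4.0],
              [1400.0, 2.0],
              [3000.0, 5.0]])
y = np.array([400.0, 330.0, 460.0, 290.0, 540.0])

# Feature scaling (z-score standardization): changes the INPUTS so that
# gradient descent converges faster; the cost function is unchanged.
X_scaled = (X - X.mean(axis=0)) / X.std(axis=0)

# Regularization (here an L2 penalty): changes the COST FUNCTION by adding
# (lam / 2m) * sum(w**2), which shrinks the weights to reduce overfitting.
def cost(w, b, X, y, lam):
    m = len(y)
    err = X @ w + b - y
    return (err @ err) / (2 * m) + lam / (2 * m) * (w @ w)

w = np.array([100.0, 10.0])
print(cost(w, 0.0, X_scaled, y, lam=0.0))  # unregularized cost
print(cost(w, 0.0, X_scaled, y, lam=1.0))  # same fit error + L2 penalty
```

Note the two prints differ only by the penalty term: the fit to the data is identical, but the regularized cost discourages large weights. Scaling and regularization are independent, and in practice you often use both.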