Feature scaling vs regularization

Hello!

While reviewing regularization I had these questions: do I still need to perform feature scaling when I am using regularization? Are they mutually exclusive? Or are there cases where I need to use both? If yes, which cases?

I would really appreciate it if someone could help me with these questions.

Thank you!

They’re separate topics.

Normalizing / feature scaling helps gradient descent converge faster, because the gradients for all of the features will have similar magnitudes.

Regularization is a method for preventing overfitting the training set.
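So they are not mutually exclusive; you would typically use both. Here is a minimal NumPy sketch (the data and hyperparameter values are made up for illustration): features on very different scales are standardized so gradient descent behaves well, and an L2 (ridge) penalty is added to the gradient to regularize the weights.

```python
import numpy as np

# Synthetic data: two features on very different scales (illustrative only).
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(0, 1, 100),      # feature in [0, 1]
                     rng.uniform(0, 1000, 100)])  # feature in [0, 1000]
y = 3.0 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0, 0.1, 100)

# Feature scaling: standardize each column to mean 0, std 1.
mu, sigma = X.mean(axis=0), X.std(axis=0)
X_scaled = (X - mu) / sigma

# Gradient descent on L2-regularized (ridge) linear regression.
lam, lr, n = 0.1, 0.1, len(y)   # lam is the regularization strength
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    err = X_scaled @ w + b - y
    grad_w = X_scaled.T @ err / n + lam * w  # L2 penalty adds lam * w
    grad_b = err.mean()
    w -= lr * grad_w
    b -= lr * grad_b
```

Note that scaling also matters *for* regularization itself: the L2 penalty shrinks all weights by the same rule, so if features are on wildly different scales, the penalty hits them unevenly.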


Thank you for the answer!