Batch normalization vs regularization

Hi StuHaze,

Welcome!

Thanks for posting this question, it's an interesting one. Well yes, batch normalization can look like an offshoot hack, but it does a proper job too :slight_smile: Here’s a thread that should clear up your query to some extent.

Regularization is the standard answer to overfitting. It adds a penalty on the model’s parameters so that none of them can weigh too heavily, which usually improves accuracy on unseen data. Be careful not to overdo it, though: too strong a penalty can push the model into underfitting instead.
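To make the "penalty on the parameters" idea concrete, here’s a minimal sketch of L2 (ridge) regularization on linear regression. All names and numbers here are illustrative, not from your setup: the penalty `lam * ||w||^2` shrinks the fitted weights toward zero compared to the unpenalized fit.

```python
import numpy as np

# Toy data: 50 samples, 3 features, known weights plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=50)

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X^T X + lam * I)^{-1} X^T y.
    # lam = 0 gives ordinary least squares; lam > 0 penalizes large weights.
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

w_plain = ridge_fit(X, y, lam=0.0)   # no penalty
w_ridge = ridge_fit(X, y, lam=10.0)  # penalized fit

# The regularized weight vector has a smaller norm than the unpenalized one.
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_plain))
```

The same idea carries over to neural networks, where the L2 penalty usually shows up as "weight decay" in the optimizer.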