Fitting Batch Norm


Wouldn’t the beta parameter we use in Batch Normalization clash with the beta hyperparameter we were using for the optimization algorithms earlier?

Could you please explain exactly what your query is, sir?

Sure! Is there a convention that we need to follow there?

Let’s say I am tuning both beta for gradient descent with momentum and beta for Batch Normalization. Wouldn’t naming both of these variables beta cause a clash?

I understand they are just variables and we can name them anything, but I was wondering if there is a convention to distinguish between them.

I get you, sir, but I don’t believe there is a standard convention. If you are using both in the same piece of code, you could name one beta1 and the other beta2 so they don’t collide, but I wouldn’t recommend that.
A cleaner approach is to write two separate functions for the two tasks; then each function can reuse the name beta locally without any conflict.
If you are writing one long script (which I don’t recommend), something like beta1 and beta2 will work.
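To make the “two separate functions” idea concrete, here is a minimal NumPy sketch. The names `momentum_update`, `batch_norm_forward`, `beta_momentum`, and `beta_bn` are my own illustrative choices, not part of the course material; the point is only that each function owns its own beta, so the two never clash.

```python
import numpy as np

def momentum_update(v, grad, beta_momentum=0.9):
    # beta_momentum is the momentum *hyperparameter* (usually called beta).
    # It is local to this function, so it cannot collide with any other beta.
    return beta_momentum * v + (1 - beta_momentum) * grad

def batch_norm_forward(x, gamma, beta_bn, eps=1e-5):
    # beta_bn is Batch Norm's *learned* shift parameter, a different beta
    # entirely. Because it lives in its own function scope, reusing the
    # name beta here would also have been safe.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta_bn
```

Each beta is confined to its own scope, which is exactly why splitting the two tasks into separate functions avoids the naming problem.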
Does that help, sir?
Thank you