I don’t understand exactly what the batch norm layer in the model structure is.
Hi, @Nicolas_Laverde!
The main purpose of batch normalization is to standardize the layer inputs, i.e. to rescale each mini-batch so that its activations have roughly zero mean and unit standard deviation. On top of that normalization, the batch norm layer learns two parameters during training: a scale \gamma and a shift \beta, which are applied to the normalized values.
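For reference, this is the standard batch norm transform (notation follows the original Ioffe & Szegedy batch normalization paper). For a mini-batch \mathcal{B}, the layer computes

$$
\hat{x} = \frac{x - \mu_{\mathcal{B}}}{\sqrt{\sigma_{\mathcal{B}}^{2} + \epsilon}}, \qquad y = \gamma \, \hat{x} + \beta
$$

where \mu_{\mathcal{B}} and \sigma_{\mathcal{B}}^{2} are the mini-batch mean and variance and \epsilon is a small constant for numerical stability.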
Freezing the batch norm layer means that \gamma and \beta are no longer updated, so they keep the values they had before freezing (and, typically, the running mean and variance statistics stop being updated as well).
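If it helps, here is a minimal PyTorch sketch of what freezing batch norm layers usually looks like in practice. The helper name `freeze_batchnorm` and the toy model are just for illustration, not part of any specific library API:

```python
import torch.nn as nn

def freeze_batchnorm(model: nn.Module) -> None:
    """Freeze every batch norm layer in `model`: stop gradient updates for
    gamma/beta and stop updating the running mean/variance statistics."""
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            if module.affine:
                # gamma (weight) and beta (bias) keep their current values
                module.weight.requires_grad = False
                module.bias.requires_grad = False
            # eval() stops updates to running_mean / running_var
            # (note: calling model.train() later would switch them back)
            module.eval()

# Usage: freeze batch norm in a (toy) backbone before fine-tuning
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU())
freeze_batchnorm(model)
```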