How to calculate the number of parameters in a BatchNorm-Layer?

Hi Andreas,

The BatchNorm layer only performs a normalization operation, so the output shape is the same as the input shape. See tf.keras.layers.BatchNormalization | TensorFlow Core v2.4.1

Hi Mentor, thank you very much for answering.

I know it’s doing normalization (and there’s a gamma and a beta…)

But the model.summary() pasted below says there are 128 parameters in the BatchNorm layer…

I am still wondering…

Greetings

Andreas

zero_padding2d_4 (ZeroPaddin (None, 70, 70, 3)   0
conv0 (Conv2D)               (None, 64, 64, 32)  4736
bn0 (BatchNormalization)     (None, 64, 64, 32)  128
activation_4 (Activation)    (None, 64, 64, 32)  0

Hi Andreas,

They are the gamma and beta weights, as well as the moving mean and moving variance (see, e.g., keras - How the number of parameters associated with BatchNormalization layer is 2048? - Stack Overflow and keras - How to set weights of the batch normalization layer? - Stack Overflow).

Each of these four parameter sets has one value per feature channel, which in this case is 32. So you get 4 * 32 = 128 parameters. Note that only gamma and beta are trainable; the moving mean and moving variance are non-trainable, but model.summary() counts them all.
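The arithmetic can be sketched without building a model at all (a minimal sketch; the channel count 32 is taken from the conv0 output shape above):

```python
# BatchNormalization keeps one value per feature channel for each of:
#   gamma (scale)           - trainable
#   beta (offset)           - trainable
#   moving_mean             - non-trainable
#   moving_variance         - non-trainable
channels = 32            # feature maps coming out of conv0: (None, 64, 64, 32)
params_per_channel = 4   # gamma, beta, moving_mean, moving_variance

total = params_per_channel * channels
trainable = 2 * channels       # gamma + beta
non_trainable = 2 * channels   # moving statistics

print(total)          # 128, matching the bn0 row in model.summary()
print(trainable)      # 64
print(non_trainable)  # 64
```

If you build the layer in Keras and call `layer.count_params()`, you should see the same 128 (with 64 listed under non-trainable params in the summary).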

If you want to dive deeper into this you can have a look at this paper:

jmlr.org/proceedings/papers/v37/ioffe15.pdf

I guess I got it.

Thank you very much!