Course 2 week 3 quiz

Hello,
I have just finished learning about Batch Norm, and while doing the quiz I am confused by the following question:

[screenshot of the quiz question, not included here]

I want to know why this is false. (That is, why are \beta and \gamma not hyperparameters?)
Thanks in advance for your support!

Adam

What’s more, are \beta and \gamma different for different layers?
(That is, do z^{[1]} and z^{[2]}, for example, share the same \beta and \gamma for Batch Norm, or does each layer have its own?)

Hi @AdamWang,

In batch normalization, \beta and \gamma are trainable parameters. They are added to the network’s set of trainable parameters (alongside the weights and biases) and are learned by gradient descent during training. They cannot be considered hyperparameters, which are held fixed while the network trains.
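
To make that concrete, here is a minimal NumPy sketch of the Batch Norm forward step for one layer (my own illustration, not the course’s assignment code; the function and variable names are mine):

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Batch-normalize pre-activations Z of shape (n_units, m_batch).

    gamma and beta have shape (n_units, 1): trainable scale and shift
    parameters of this layer, learned just like W and b.
    """
    mu = np.mean(Z, axis=1, keepdims=True)    # per-unit mean over the mini-batch
    var = np.var(Z, axis=1, keepdims=True)    # per-unit variance over the mini-batch
    Z_norm = (Z - mu) / np.sqrt(var + eps)    # normalize to zero mean, unit variance
    return gamma * Z_norm + beta              # learned scale and shift

# Each layer keeps its own gamma and beta in the parameter set:
n_units = 5                                   # hypothetical layer width
gamma = np.ones((n_units, 1))                 # trainable, initialized to 1
beta = np.zeros((n_units, 1))                 # trainable, initialized to 0
Z = np.random.randn(n_units, 32)              # pre-activations for a mini-batch of 32
Z_tilde = batchnorm_forward(Z, gamma, beta)
```

During backprop you compute d\gamma and d\beta and update them with the same gradient-descent step as dW and db, which is exactly why they are parameters rather than hyperparameters.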


Hi,

I have a follow-up question about the same quiz question.

Isn’t the option “There is one global…for each layer, and applies to all the hidden units in that layer” correct?

In the “Adding Batch Norm to a Network” video, \beta and \gamma are indexed by layer. Does that mean we have a different \gamma and \beta for each hidden unit, while the mean and std. dev. apply to all of them?
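
For reference, here is the per-layer notation as I copied it from the lecture (my own transcription, so it may differ slightly from the slides):

$$\mu^{[l]} = \frac{1}{m}\sum_{i=1}^{m} z^{[l](i)}, \qquad \sigma^{2[l]} = \frac{1}{m}\sum_{i=1}^{m}\left(z^{[l](i)} - \mu^{[l]}\right)^2$$

$$z^{[l](i)}_{\text{norm}} = \frac{z^{[l](i)} - \mu^{[l]}}{\sqrt{\sigma^{2[l]} + \varepsilon}}, \qquad \tilde{z}^{[l](i)} = \gamma^{[l]}\, z^{[l](i)}_{\text{norm}} + \beta^{[l]}$$

Everything here carries the layer index [l], which is what made me wonder about the per-hidden-unit dimensions.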

Thanks