[Week 3] Exercise 4 - initialize_parameters

If I initialize b using tf.zeros, I get this error:

Use the GlorotNormal initializer

If I initialize b using tf.keras.initializers.GlorotNormal, I pass the test cases. However, I thought b should be initialized with tf.zeros.

Am I missing something?


Hi caominhvu,

You don’t have to initialize the bias to zeros; it is just a convenient and common choice because it tends not to hurt gradient convergence. There is also some literature suggesting that initializing the bias with non-zero values (such as GlorotNormal here) can improve convergence time.

In any case, the bias can be initialized with zeros or small values. I expect GlorotNormal is used here just to minimize confusion and keep a single implementation standard across the exercise.
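To make the difference concrete, here is a minimal sketch of both options (the layer sizes and seed are just illustrative, not the assignment's actual values): the weights use GlorotNormal either way, while the bias can be zeros in general but needs GlorotNormal to match this exercise's tests.

```python
import tensorflow as tf

# Hypothetical layer sizes, for illustration only.
n_x, n_h = 12288, 25

initializer = tf.keras.initializers.GlorotNormal(seed=1)

# Weights: Glorot initialization is the standard choice.
W1 = tf.Variable(initializer(shape=(n_h, n_x)))

# Bias, option 1: zeros — fine in general practice.
b1_zeros = tf.Variable(tf.zeros((n_h, 1)))

# Bias, option 2: GlorotNormal — what this exercise's tests expect.
b1_glorot = tf.Variable(initializer(shape=(n_h, 1)))

print(float(tf.reduce_sum(b1_zeros)))  # 0.0 — all zeros
```

Both variants train fine for a network this small; the grader simply checks against one fixed initialization scheme.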

(I found this post interesting: https://becominghuman.ai/priming-neural-networks-with-an-appropriate-initializer-7b163990ead)

Hope that helps!


Thank you @neurogeek, the link is very interesting.