Q: Should I use the Glorot Uniform Initializer or Xavier? A: Yes

From time to time in this class material, the forum threads, or in papers out on the interweb, you may see references to both the Glorot Uniform Initializer and the Xavier Initializer and wonder, ‘Which one should I use?’ Luckily, the answer is simply ‘Yes’, because they are the same thing :sunglasses:

See for example: tf.keras.initializers.GlorotUniform  |  TensorFlow Core v2.8.0

The Glorot uniform initializer, also called Xavier uniform initializer.

And the reason for the ambiguity is that Xavier and Glorot are one person: https://www.aminer.org/profile/xavier-glorot/53f459b6dabfaee02ad655fe
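If you want to confirm this in Keras itself, here is a minimal sketch (the layer size and seed are just for illustration): the string alias ‘glorot_uniform’, which is also the default kernel initializer for Dense layers, resolves to the same class as tf.keras.initializers.GlorotUniform.

```python
import tensorflow as tf

# "Xavier uniform" and "Glorot uniform" name the same initializer.
init = tf.keras.initializers.GlorotUniform(seed=0)

# The string alias 'glorot_uniform' (the default kernel_initializer
# of Dense layers) resolves to the same class:
same_init = tf.keras.initializers.get("glorot_uniform")
print(type(init) is type(same_init))  # True

# Using it explicitly in a layer:
layer = tf.keras.layers.Dense(64, kernel_initializer=init)
```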


Excellent! Thanks for sharing this info.

Indeed they are the same person, but beware that there are two Glorot initializers:

The uniform initializer, introduced in the original paper “Understanding the difficulty of training deep feedforward neural networks” (Xavier Glorot and Yoshua Bengio, 2010). In Keras:

tf.keras.initializers.GlorotUniform

The normal initializer, which is the one used in the course. In Keras:

tf.keras.initializers.GlorotNormal
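For anyone curious about the practical difference, here is a minimal sketch (the fan_in/fan_out sizes are arbitrary, just for illustration): GlorotUniform draws from a uniform distribution on [-limit, limit] with limit = sqrt(6 / (fan_in + fan_out)), while GlorotNormal draws from a truncated normal distribution with stddev = sqrt(2 / (fan_in + fan_out)).

```python
import numpy as np
import tensorflow as tf

fan_in, fan_out = 256, 128        # arbitrary example sizes
shape = (fan_in, fan_out)

# GlorotUniform: U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))
w_uniform = tf.keras.initializers.GlorotUniform(seed=0)(shape)
limit = np.sqrt(6.0 / (fan_in + fan_out))
print(float(tf.reduce_max(tf.abs(w_uniform))) <= limit)   # True: all samples within the limit

# GlorotNormal: truncated normal with stddev = sqrt(2 / (fan_in + fan_out))
w_normal = tf.keras.initializers.GlorotNormal(seed=0)(shape)
stddev = np.sqrt(2.0 / (fan_in + fan_out))
print(float(tf.math.reduce_std(w_normal)), stddev)        # roughly the same scale
```

Both choices keep the variance of activations roughly constant from layer to layer; which one a given course or framework uses is mostly a matter of convention.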