How to use BatchNormalization in tf 2.18?

I am trying to run the W2A1 Residual_Network exercise with tf 2.18. The
tf.keras.backend.set_learning_phase(False) call is deprecated, and X = BatchNormalization(axis = 3)(X) requires a call argument: training=learning_phase. What is the correct way to build the identity_block in this case? Should I just add an extra training parameter to identity_block to switch between training and inference?

I think I figured it out. The BatchNormalization here is not the layer from tf.keras.layers.BatchNormalization; it is actually a custom implementation in resnet_utils.py. I replaced it with tf.keras.layers.BatchNormalization and passed in training=training. It works.
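For anyone hitting the same issue, here is a minimal sketch of what an identity_block can look like with the built-in layer and the training flag threaded through. The filter counts and kernel sizes below are illustrative, not the assignment's exact values; the point is that every BatchNormalization call receives training=training.

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation, Add

def identity_block(X, f, filters, training=False):
    """Identity block sketch: three convolutions plus a skip connection.

    `f` is the kernel size of the middle conv, `filters` holds the three
    channel counts. The `training` flag is forwarded to every
    BatchNormalization call, so the layer uses batch statistics while
    training and its moving averages at inference.
    """
    F1, F2, F3 = filters
    X_shortcut = X  # save the input for the skip connection

    # first component: 1x1 conv -> batch norm -> relu
    X = Conv2D(F1, kernel_size=1, strides=1, padding='valid')(X)
    X = BatchNormalization(axis=3)(X, training=training)
    X = Activation('relu')(X)

    # second component: fxf conv -> batch norm -> relu
    X = Conv2D(F2, kernel_size=f, strides=1, padding='same')(X)
    X = BatchNormalization(axis=3)(X, training=training)
    X = Activation('relu')(X)

    # third component: 1x1 conv -> batch norm (relu comes after the add)
    X = Conv2D(F3, kernel_size=1, strides=1, padding='valid')(X)
    X = BatchNormalization(axis=3)(X, training=training)

    # add the shortcut, then apply the final relu
    X = Add()([X, X_shortcut])
    X = Activation('relu')(X)
    return X

# quick shape check: F3 must equal the input channel count for the add
inp = tf.random.normal((4, 8, 8, 6))
out = identity_block(inp, f=3, filters=[4, 4, 6], training=False)
print(out.shape)  # (4, 8, 8, 6)
```

With training=False the batch-norm layers normalize using their moving mean and variance; passing training=True during the forward pass of model fitting makes them use the current batch statistics instead.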


Outside of the specialization, I doubt that anyone would use this custom layer due to licensing issues.

As far as TensorFlow 2.18 is concerned, the following snippet works just as well:

>>> import tensorflow as tf
>>> tf.__version__
'2.18.0'
>>> dummy = tf.random.normal((32, 32, 3))
>>> layer = tf.keras.layers.BatchNormalization()
>>> out = layer(dummy, training=False)
>>> out.shape
TensorShape([32, 32, 3])