Tf functions redundant?

I am going through the TensorFlow programming assignment, and I can't understand the reason for the existence of multiple functions that seemingly do the same thing. What's the difference between these?

tf.keras.activations.sigmoid() vs tf.sigmoid()
tf.math.add() vs tf.add()
tf.linalg.matmul() vs tf.matmul()

In the case of add and matmul, those are aliases of the same function, provided for convenience.
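You can check this yourself in a quick sketch (assuming a TF 2.x install; whether the `is` comparison returns True can depend on how the symbols are exported in your version):

```python
import tensorflow as tf

# If these names are aliases, they point at the very same
# function object, so the identity check should return True.
print(tf.add is tf.math.add)          # expected: True
print(tf.matmul is tf.linalg.matmul)  # expected: True
```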

In the case of sigmoid, it looks like those are different functions, but underneath they seem to use the same operation. I guess this is because Keras was an independent library until it was incorporated into TensorFlow in version 2.0.
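A small sketch to illustrate (again assuming TF 2.x; the sample tensor is just made up for the example):

```python
import tensorflow as tf

x = tf.constant([-2.0, 0.0, 2.0])

# Separate Python callables (one lives under tf.keras.activations),
# but both compute the sigmoid of x, so the values should match.
print(tf.sigmoid(x).numpy())
print(tf.keras.activations.sigmoid(x).numpy())
```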
