Out of curiosity, I want to know whether TensorFlow uses gradient descent or Newton's method for the backpropagation algorithm. Thanks
Hi @Emekadavid, great question! TensorFlow lets you specify the optimization algorithm when you compile a model:
import tensorflow as tf

# assuming `model` is a tf.keras.Model defined elsewhere
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss=tf.keras.losses.CategoricalCrossentropy(),
              metrics=['accuracy'])
Here we are using stochastic gradient descent (SGD). As far as I know, TensorFlow's built-in Keras optimizers are all first-order, gradient-based methods, so Newton's method is not supported out of the box.
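To make the "gradient descent" part concrete, here is a minimal sketch of a custom training step with tf.GradientTape. Backpropagation in TensorFlow is reverse-mode automatic differentiation: it produces first-order gradients, which the chosen optimizer then applies; no Hessian is computed. The model, shapes, and random data below are made up purely for illustration.

import tensorflow as tf

# Toy model and optimizer, purely for illustration.
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                             tf.keras.layers.Dense(10, activation='softmax')])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
loss_fn = tf.keras.losses.CategoricalCrossentropy()

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        predictions = model(x, training=True)   # forward pass
        loss = loss_fn(y, predictions)          # scalar loss
    # Backpropagation: reverse-mode autodiff gives first-order gradients.
    gradients = tape.gradient(loss, model.trainable_variables)
    # Plain gradient-descent update; no second-order (Hessian) information used.
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss

# Example call with random data.
x = tf.random.normal([32, 4])
y = tf.one_hot(tf.random.uniform([32], maxval=10, dtype=tf.int32), depth=10)
print(train_step(x, y))

If you really need a Newton-style method, TensorFlow Probability provides quasi-Newton routines such as tfp.optimizer.lbfgs_minimize, but those operate on a value-and-gradients function rather than through the Keras compile/fit workflow.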
Thanks for the information. Appreciated