[Week 1] tf.keras.layers.ReLU() vs tf.keras.layers.Activation(activation='relu')

I would like to know what the difference is between using tf.keras.layers.Activation(activation='relu')
(tf.keras.layers.Activation  |  TensorFlow Core v2.4.1)
and tf.keras.layers.ReLU().
I tried to read the documentation for both, but it's not really clear to me when I should use which.
The only difference I could spot was in the output from the automated grader
(from this line of code: comparator(summary(conv_model), output))

['Activation', instead of 'ReLU']

and the grader only accepts tf.keras.layers.ReLU().

Does anyone know why?

Many thanks in advance for your time and for your help! :slight_smile:


Hey @garrofederico!
I don't think there is a difference between those two (you can specify a negative slope and a threshold in tf.keras.layers.ReLU(), but those don't come into play in this context).
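For reference, here's a small standalone sketch (not from the assignment notebook) of the extra knobs tf.keras.layers.ReLU() exposes that tf.keras.layers.Activation('relu') doesn't:

```python
import tensorflow as tf

x = tf.constant([-2.0, 1.0, 8.0])

# With a negative slope, the layer behaves like a leaky ReLU:
leaky = tf.keras.layers.ReLU(negative_slope=0.1)
# With a max value, activations are capped (e.g. ReLU6):
capped = tf.keras.layers.ReLU(max_value=6.0)

print(leaky(x).numpy())   # [-0.2  1.   8. ]
print(capped(x).numpy())  # [0. 1. 6.]
```

With the default arguments (negative_slope=0.0, max_value=None, threshold=0.0), it computes exactly the same function as Activation('relu').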

For the test you're failing with tf.keras.layers.Activation(): we compare the summaries of the models, and the ReLU and Activation layers show up under different names there, hence the error. Since the assignment expects you to use tf.keras.layers.ReLU() (as shown by how the documentation was linked in the notebook), I'd advise just using that here.

And no worries, the two are completely equivalent in this context. Hopefully this clears it up for you :slight_smile:
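If you want to convince yourself, a quick sketch comparing the two layers on the same input (a made-up tensor, not assignment code):

```python
import tensorflow as tf

x = tf.constant([[-2.0, -0.5, 0.0, 1.5]])

relu_layer = tf.keras.layers.ReLU()
act_layer = tf.keras.layers.Activation('relu')

# Both clamp negatives to zero and pass positives through unchanged.
print(relu_layer(x).numpy())
print(act_layer(x).numpy())
# Both print [[0.  0.  0.  1.5]] -- identical outputs; only the
# layer's name in model.summary() differs.
```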


Ok, so you confirmed what I suspected…

Many thanks for your help!!