Week 2, ResNet programming assignment: what's the difference between X = Activation('relu')(X) and X = tf.keras.layers.ReLU()(X)?

When I initially tried X = tf.keras.layers.ReLU()(X) in the identity_block function, it passed all the tests, but it turned out to be a problem for the Exercise 3 ResNet50 function: the test for ResNet50 kept returning an error. As soon as I went back to the identity block function and changed it to X = Activation('relu')(X), it worked perfectly. Where's the catch? Thanks ahead!

The two are functionally equivalent. But you'll notice that the test cell for the ResNet50 function effectively does a "string compare" against the model it expects, and the string representations of those two layers are different.
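Here's a minimal standalone sketch of what I mean (plain TF2, not the course's test code): both layers compute the same function, but they are different layer classes with different default names, and that is exactly what a string-level comparison sees.

```python
from tensorflow.keras.layers import Activation, ReLU
import tensorflow as tf

act = Activation('relu')  # generic Activation layer wrapping the 'relu' function
rel = ReLU()              # dedicated ReLU layer class

# Same math: both compute max(x, 0) on the same input...
x = tf.constant([[-1.0, 0.0, 2.0]])
print(act(x).numpy())  # [[0. 0. 2.]]
print(rel(x).numpy())  # [[0. 0. 2.]]

# ...but they are different classes with different auto-generated names
# (counters may vary if you have created other layers in the session).
print(type(act).__name__, act.name)  # Activation activation
print(type(rel).__name__, rel.name)  # ReLU re_lu
```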

Speaking of “noticing things”, have a more careful look at the instructions for the identity block and the conv block. They are very explicit about which form of ReLU you are supposed to use. You may think you are saving yourself time by breezing through the instructions, but was that a net savings of time in this case? He asked rhetorically :nerd_face:

I see what you mean. I don't like to blindly follow instructions; I have to understand why. I didn't check the test cell. Thanks @paulinpaloalto for drawing my attention to that! :+1:

But the failure output from the test should have been pretty clear, right? It's a comparison between the two different string representations, even if you didn't actually examine how the test is implemented.
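For illustration, here's a hypothetical sketch of the kind of layer-by-layer string comparison that produces that failure output. The layer_signature helper is my stand-in for the grader's comparator, not the actual test code:

```python
from tensorflow.keras.layers import Activation, Input, ReLU
from tensorflow.keras.models import Model

def layer_signature(model):
    # Hypothetical stand-in for the test's comparator:
    # one string per layer of the model.
    return [type(layer).__name__ for layer in model.layers]

inp = Input(shape=(4,))
expected  = Model(inp, Activation('relu')(inp))  # what the test expects
submitted = Model(inp, ReLU()(inp))              # what was actually built

print(layer_signature(expected))   # ['InputLayer', 'Activation']
print(layer_signature(submitted))  # ['InputLayer', 'ReLU']
# The mismatch surfaces as 'Activation' vs 'ReLU' in the failure output,
# even though the two models compute identical results.
```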