Residual networks, exercise 3: expecting an Activation layer but a ReLU layer is found

Hi everyone,
when building the ResNet, I get the following error:


It seems that the problem is the ReLU activation layer. The grader wants me to use an Activation layer, but I do not know what that means (ReLU is an activation layer).
Thanks in advance for your help :slight_smile:

Hey, you should check the instructions right before the identity_block and convolutional_block functions; there you will find how to use the Activation layer.

By the way, Activation is a layer in Keras that can apply different activation functions (relu, softmax, sigmoid, …) just by passing the function's name as a string.
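To illustrate, here is a minimal sketch (assuming TensorFlow/Keras is installed) showing that Activation('relu') computes the same thing as the dedicated ReLU layer:

```python
import numpy as np
from tensorflow.keras.layers import Activation, ReLU

x = np.array([[-2.0, -1.0, 0.0, 1.0, 2.0]], dtype=np.float32)

# Activation takes the activation function's name as a string...
act = Activation('relu')
# ...while ReLU is a dedicated layer class for that one function.
relu = ReLU()

# Both zero out the negatives and pass positives through unchanged.
print(act(x).numpy())   # [[0. 0. 0. 1. 2.]]
print(relu(x).numpy())  # [[0. 0. 0. 1. 2.]]
```

You could swap 'relu' for 'sigmoid' or 'softmax' in the Activation call without changing anything else.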

Since the exercise is checked by a tester function, you'll need to follow the instructions in this case, otherwise the layer names won't match :stuck_out_tongue:. But yes, your solution is also correct when you use the ReLU layer directly.

Thank you for your help!
The problem was that I used ReLU() instead of Activation('relu') in the identity_block function… Very subtle :slight_smile:
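For anyone else hitting this: a quick sketch of why the grader complains even though the math is identical. A type-checking grader compares layer classes, and the two layers have different class names (assuming the grader inspects types, which the error message suggests):

```python
from tensorflow.keras.layers import Activation, ReLU

# Functionally equivalent, but different classes:
print(type(Activation('relu')).__name__)  # Activation
print(type(ReLU()).__name__)              # ReLU
```

So any comparator that checks `isinstance(layer, Activation)` or the class name will reject ReLU() even though both produce the same outputs.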

Thank you!!!