Week 1 Assignment 2 Convolution Model

Hello everyone,
I am getting this error, but everything seems fine to me:

['ZeroPadding2D', (None, 70, 70, 3), 0, ((3, 3), (3, 3))]
['Conv2D', (None, 64, 64, 32), 4736, 'valid', 'linear', 'GlorotUniform']
['BatchNormalization', (None, 64, 64, 32), 128]
['Activation', (None, 64, 64, 32), 0]
['MaxPooling2D', (None, 32, 32, 32), 0, (2, 2), (2, 2), 'valid']
['Flatten', (None, 32768), 0]
['Dense', (None, 1), 32769, 'sigmoid']
Test failed
Expected value
['ReLU', (None, 64, 64, 32), 0]
does not match the input value:
['Activation', (None, 64, 64, 32), 0]

Can anyone help me? I don't understand: the activation function is set according to the documentation.

Did you use the ReLU() layer, or did you use an Activation layer with a ReLU argument?

tfl.Activation('relu')

That’s the problem. Use the ReLU() layer.
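For reference, here is a minimal sketch of the layer stack, reconstructed from the comparator output above (the 7x7 kernel is inferred from the 4736 Conv2D parameters: 32 * (7*7*3 + 1)). Arguments not shown in the output, such as the BatchNormalization axis, follow common Keras usage and are assumptions, so your assignment code may differ in those details:

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),   # 64x64 RGB input
    tfl.ZeroPadding2D(padding=3),        # -> (70, 70, 3)
    tfl.Conv2D(32, 7, strides=1),        # 7x7 kernel inferred from 4736 params
    tfl.BatchNormalization(axis=3),      # axis=3 is an assumption (channels-last)
    tfl.ReLU(),                          # use ReLU(), not Activation('relu')
    tfl.MaxPool2D(pool_size=(2, 2)),     # -> (32, 32, 32)
    tfl.Flatten(),                       # -> 32768
    tfl.Dense(1, activation='sigmoid'),  # 32769 params
])

model.summary()
```

The parameter counts (4736, 128, 32769) match the comparator output line by line.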

I already rectified the error, but thank you.

I had the same issue and fixed it with tfl.ReLU(). However, I don't understand the difference here: why does one work and the other doesn't?

Functionally they are probably the same.

The grader is looking for a ReLU layer specifically.
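To make the distinction concrete: both layers apply max(x, 0), so their outputs are identical, but they are different layer classes. A grader that compares model summaries by layer class name will therefore accept one and reject the other. A quick check:

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

x = tf.constant([[-1.0, 0.0, 2.0]])

relu_layer = tfl.ReLU()             # dedicated ReLU layer
act_layer = tfl.Activation('relu')  # generic Activation layer wrapping relu

print(relu_layer(x).numpy())        # [[0. 0. 2.]]
print(act_layer(x).numpy())         # [[0. 0. 2.]] -- same computation

print(type(relu_layer).__name__)    # ReLU
print(type(act_layer).__name__)     # Activation -- different class name
```

So "functionally the same" is exactly right; the comparator just matches on the class name.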