C4W1-Assignment#2 error

For the 2nd programming assignment of Week 1, I got "Test failed. Your output is not as expected output" for happyModel(). However, I don't think the issue is with the comparator or summary functions, since the only difference is between 'Activation' and 'ReLU', which does not seem to be controlled by my code.

Here’s the output of my program:
['ZeroPadding2D', (None, 70, 70, 3), 0, ((3, 3), (3, 3))]
['Conv2D', (None, 64, 64, 32), 4736, 'valid', 'linear', 'GlorotUniform']
['BatchNormalization', (None, 64, 64, 32), 128]
['Activation', (None, 64, 64, 32), 0]
['MaxPooling2D', (None, 32, 32, 32), 0, (2, 2), (2, 2), 'valid']
['Flatten', (None, 32768), 0]
['Dense', (None, 1), 32769, 'sigmoid']
Test failed. Your output is not as expected output.

Below is expected output:
['ZeroPadding2D', (None, 70, 70, 3), 0, ((3, 3), (3, 3))]
['Conv2D', (None, 64, 64, 32), 4736, 'valid', 'linear', 'GlorotUniform']
['BatchNormalization', (None, 64, 64, 32), 128]
['ReLU', (None, 64, 64, 32), 0]
['MaxPooling2D', (None, 32, 32, 32), 0, (2, 2), (2, 2), 'valid']
['Flatten', (None, 32768), 0]
['Dense', (None, 1), 32769, 'sigmoid']
All tests passed!

Maybe you are using tf.keras.layers.Activation('relu'). You have to use tf.keras.layers.ReLU, as instructed. Both perform the same function, but here ReLU is treated as a layer, not just an activation function, so the model summary names them differently.
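For reference, here is a minimal sketch of what the Sequential layer stack could look like, with the hyperparameters inferred from the expected summary above (64x64x3 input, 7x7 kernel, 32 filters, padding of 3); the exact arguments in your notebook may differ, the point is only the ReLU layer on line 4:

```python
import tensorflow as tf

def happyModel():
    # Sketch only: layer sizes inferred from the expected summary above.
    model = tf.keras.Sequential([
        tf.keras.layers.ZeroPadding2D(padding=3, input_shape=(64, 64, 3)),
        tf.keras.layers.Conv2D(filters=32, kernel_size=7, strides=1),
        tf.keras.layers.BatchNormalization(axis=3),
        # Use the ReLU *layer*, not Activation('relu'), so the summary
        # reports 'ReLU' and the comparator test passes.
        tf.keras.layers.ReLU(),
        tf.keras.layers.MaxPooling2D(),  # defaults: pool_size=(2, 2), 'valid'
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])
    return model
```

If everything else already matches, swapping in tf.keras.layers.ReLU() is the only change needed; model.summary() will then list the fourth layer as ReLU instead of Activation.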


Got it, thank you Saifkhanengr.