Course 4 Week 1 Assignment 2!

I’ve been trying to define the model:

{moderator edit - solution code removed}

# Print a summary for each layer
for layer in summary(happy_model):
    print(layer)
    
output = [['ZeroPadding2D', (None, 70, 70, 3), 0, ((3, 3), (3, 3))],
            ['Conv2D', (None, 64, 64, 32), 4736, 'valid', 'linear', 'GlorotUniform'],
            ['BatchNormalization', (None, 64, 64, 32), 128],
            ['ReLU', (None, 64, 64, 32), 0],
            ['MaxPooling2D', (None, 32, 32, 32), 0, (2, 2), (2, 2), 'valid'],
            ['Flatten', (None, 32768), 0],
            ['Dense', (None, 1), 32769, 'sigmoid']]
    
comparator(summary(happy_model), output)

But I got an error:

['ZeroPadding2D', (None, 70, 70, 3), 0, ((3, 3), (3, 3))]
['Conv2D', (None, 64, 64, 32), 4736, 'valid', 'linear', 'GlorotUniform']
['BatchNormalization', (None, 64, 64, 32), 128]
['Activation', (None, 64, 64, 32), 0]
['MaxPooling2D', (None, 32, 32, 32), 0, (2, 2), (2, 2), 'valid']
['Flatten', (None, 32768), 0]
['Dense', (None, 1), 32769, 'sigmoid']
Test failed 
 Expected value 

 ['ReLU', (None, 64, 64, 32), 0] 

 does not match the input value: 

 ['Activation', (None, 64, 64, 32), 0]

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-17-f33284fd82fe> in <module>
     12             ['Dense', (None, 1), 32769, 'sigmoid']]
     13 
---> 14 comparator(summary(happy_model), output)

~/work/release/W1A2/test_utils.py in comparator(learner, instructor)
     20                   "\n\n does not match the input value: \n\n",
     21                   colored(f"{a}", "red"))
---> 22             raise AssertionError("Error in test")
     23     print(colored("All tests passed!", "green"))
     24 

AssertionError: Error in test

Please help me!

The error message is telling you that you are specifying the ReLU layer incorrectly: you are using the generic “Activation” layer function and passing it the argument “relu”, but the test expects the explicit layer function for ReLU. The instructions show it to you.
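For reference, the difference looks roughly like this (a minimal sketch, not the full solution):

from tensorflow.keras.layers import Activation, ReLU

# Generic wrapper - the layer summary reports this as 'Activation':
Activation('relu')

# Explicit layer class - the summary reports this as 'ReLU',
# which is what the comparator test checks for:
ReLU()

Both compute the same function, but the comparator matches on the layer class name, so only the second form passes.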

Also note that it is not necessary to specify the data_format on the first layer there.
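Something along these lines should do for the first layer (again just a sketch; the padding and input shape are read off the expected output you posted, and 'channels_last' is the Keras default so data_format can be omitted):

from tensorflow.keras.layers import ZeroPadding2D

# Pads a (64, 64, 3) input to (70, 70, 3), matching the expected
# ZeroPadding2D line in the summary:
ZeroPadding2D(padding=(3, 3), input_shape=(64, 64, 3))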

Thank you, I’ve done it :+1: