C2_W2 Exercise 2

When running the unit tests for exercise 2, I get this error. I am unsure what it means or how to fix it. Any help would be much appreciated.

    f"Wrong number of units in layer {i}. Expected {expected[i][1]} but got {layer.output.shape.as_list()}"
assert layer.activation == expected[i][2], \
    f"Wrong activation in layer {i}. Expected {expected[i][2]} but got {layer.activation}"
i = i + 1

AssertionError: Wrong activation in layer 0. Expected <function relu at 0x7f01fce14950> but got <keras.layers.advanced_activations.ReLU object at 0x7f01d00cbd10>

I suspect you used a separate Activation layer, rather than just specifying
activation='relu'
as a parameter.
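To illustrate the distinction: when you pass the lowercase string 'relu', Keras resolves it to the relu activation function, which is what the unit test compares against; passing a ReLU layer object (or a string that resolves to one) stores the layer instance instead, producing exactly this assertion failure. A minimal sketch, assuming TF 2.x:

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense

# String form: Keras resolves 'relu' to the activation *function*,
# which matches tf.keras.activations.relu in the grader's check.
good = Dense(25, activation='relu')

# Layer-object form: stores a ReLU *layer instance*, so the grader's
# equality check against the relu function fails.
bad = Dense(25, activation=tf.keras.layers.ReLU())

assert good.activation is tf.keras.activations.relu
assert isinstance(bad.activation, tf.keras.layers.ReLU)
```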

I have my network layers as follows:
tf.keras.layers.Input(shape=(400,)),
Dense(25, 'ReLU', name='Layer1'),
Dense(15, 'ReLU', name='Layer2'),
Dense(10, 'linear', name='Layer3'),

The error only occurs when running test_model(model, 10, 400).

I fixed the error by changing the spelling from 'ReLU' to 'relu'.
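For anyone hitting the same assertion: with the lowercase string, each Dense layer's activation attribute is the relu function itself, which is what test_model expects. A corrected version of the model above, as a sketch assuming TF 2.x:

```python
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    tf.keras.layers.Input(shape=(400,)),
    Dense(25, activation='relu', name='Layer1'),
    Dense(15, activation='relu', name='Layer2'),
    Dense(10, activation='linear', name='Layer3'),
])

# The lowercase strings resolve to the activation functions the
# unit test compares against, so these checks pass.
assert model.layers[0].activation is tf.keras.activations.relu
assert model.layers[2].activation is tf.keras.activations.linear
```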
