Convolutional_Model

Deep Learning Course 4 - Week 1
Two-part issue:

  1. If I use tfl.ReLU()(Z1), I get an error that ReLU is not an attribute of tfl.
  2. If I then change it to tfl.Activation('relu')(Z1), I get output. However, it does not match the expected output, as shown below.

Model: "functional_3"
_________________________________________________________________
Layer (type)                   Output Shape              Param #
=================================================================
input_5 (InputLayer)           [(None, 64, 64, 3)]       0
_________________________________________________________________
conv2d_7 (Conv2D)              (None, 64, 64, 8)         392
_________________________________________________________________
activation_3 (Activation)      (None, 64, 64, 8)         0
_________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 8, 8, 8)           0
_________________________________________________________________
conv2d_8 (Conv2D)              (None, 8, 8, 16)          528
_________________________________________________________________
activation_4 (Activation)      (None, 8, 8, 16)          0
_________________________________________________________________
max_pooling2d_5 (MaxPooling2D) (None, 2, 2, 16)          0
_________________________________________________________________
flatten_2 (Flatten)            (None, 64)                0
_________________________________________________________________
dense_2 (Dense)                (None, 6)                 390
=================================================================
Total params: 1,310
Trainable params: 1,310
Non-trainable params: 0
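The parameter counts in that summary can be checked by hand. The kernel sizes below (4x4 for the first conv, 2x2 for the second) are not stated in the summary; they are inferred from the parameter counts, so treat them as assumptions:

```python
# Conv2D params = kernel_h * kernel_w * in_channels * filters + filters (biases)
conv1 = 4 * 4 * 3 * 8 + 8        # 392, matches conv2d_7
conv2 = 2 * 2 * 8 * 16 + 16      # 528, matches conv2d_8

# Dense params = inputs * units + units; Flatten of (2, 2, 16) gives 64 inputs
dense = 64 * 6 + 6               # 390, matches dense_2

print(conv1, conv2, dense, conv1 + conv2 + dense)  # 392 528 390 1310
```

The total, 1,310, agrees with the summary, so the architecture itself is fine; only the activation layer's type differs from what the grader expects.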


Test failed
Expected value

['ReLU', (None, 64, 64, 8), 0]

does not match the input value:

['Activation', (None, 64, 64, 8), 0]

AssertionError                            Traceback (most recent call last)
in
     15     ['Dense', (None, 6), 390, 'softmax']]
     16
---> 17 comparator(summary(conv_model), output)

~/work/release/W1A2/test_utils.py in comparator(learner, instructor)
     20     "\n\n does not match the input value: \n\n",
     21     colored(f"{a}", "red"))
---> 22     raise AssertionError("Error in test")
     23 print(colored("All tests passed!", "green"))
     24

AssertionError: Error in test
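The failure makes sense once you see what is being compared. Judging from the error message, the comparator matches layer descriptors whose first entry is the layer's class name, so tfl.Activation('relu') fails even though it computes exactly the same function as tfl.ReLU(). A hypothetical simplification of that check (the descriptor format is an assumption based on the output above, not the actual test_utils.py code):

```python
# Descriptors as they appear in the error message: [class_name, output_shape, params]
expected = ['ReLU', (None, 64, 64, 8), 0]
actual = ['Activation', (None, 64, 64, 8), 0]  # produced by tfl.Activation('relu')

# Shapes and parameter counts agree; only the class name differs,
# which is enough for an equality-based comparator to reject it.
assert expected[1:] == actual[1:]
assert expected != actual
```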

Hi @preston7063, and welcome to Discourse. The following snippet:

layer = tfl.ReLU()
output = layer([-3.0, -1.0, 0.0, 2.0])
list(output.numpy())
# [0.0, 0.0, 0.0, 2.0]

works in the assignment notebook. It was taken from the documentation for tf.keras.layers.ReLU (tf.keras.layers.ReLU | TensorFlow Core v2.4.1), which the notebook abbreviates as tfl.ReLU. I can't see your code, but this shows that ReLU is part of tfl once you import tensorflow.keras.layers as tfl. Could you try the fully qualified name, tensorflow.keras.layers.ReLU, instead of tfl, and see what you get?
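For reference, here is a minimal sketch of a functional-API model that reproduces the summary above using tfl.ReLU() as a layer. The kernel sizes (4x4 and 2x2), 'same' padding, and pool sizes (8 and 4 with matching strides) are assumptions inferred from the output shapes and parameter counts, not the official assignment solution:

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

def conv_model(input_shape=(64, 64, 3)):
    """Sketch of a CONV -> ReLU -> MAXPOOL x2 -> FLATTEN -> DENSE model.

    Hyperparameters (kernel/pool sizes, padding) are inferred from the
    summary in this thread and may differ from the assignment spec.
    """
    inputs = tf.keras.Input(shape=input_shape)
    Z1 = tfl.Conv2D(8, 4, padding='same')(inputs)       # (None, 64, 64, 8), 392 params
    A1 = tfl.ReLU()(Z1)                                 # ReLU as a layer, not tfl.Activation
    P1 = tfl.MaxPool2D(pool_size=8, strides=8, padding='same')(A1)  # (None, 8, 8, 8)
    Z2 = tfl.Conv2D(16, 2, padding='same')(P1)          # (None, 8, 8, 16), 528 params
    A2 = tfl.ReLU()(Z2)
    P2 = tfl.MaxPool2D(pool_size=4, strides=4, padding='same')(A2)  # (None, 2, 2, 16)
    F = tfl.Flatten()(P2)                               # (None, 64)
    outputs = tfl.Dense(6, activation='softmax')(F)     # (None, 6), 390 params
    return tf.keras.Model(inputs=inputs, outputs=outputs)

model = conv_model()
model.summary()  # total params should come to 1,310, as in the summary above
```

With tfl.ReLU() in place of tfl.Activation('relu'), the summary lists ReLU layers by class name, which is what the comparator checks for.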