Course 4 Week 1 Assignment 2, happyModel - Dense Function

Hello dear team,

I'm stuck on the last part of the fourth code cell (Exercise 1 - happyModel).

## Dense layer with 1 unit for output & 'sigmoid' activation

After figuring out that the "tf.keras.layers." prefix is a syntax prerequisite (compared to the old non-Keras version), I reused my old code with the parameters "(1, activation='sigmoid', name='fc')".

But the error message is as follows:

['ZeroPadding2D', (None, 70, 70, 3), 0, ((3, 3), (3, 3))]
['Conv2D', (None, 64, 64, 32), 4736, 'valid', 'linear', 'GlorotUniform']
['BatchNormalization', (None, 64, 64, 32), 128]
['Activation', (None, 64, 64, 32), 0]
['MaxPooling2D', (None, 32, 32, 32), 0, (2, 2), (2, 2), 'valid']
['Flatten', (None, 32768), 0]
['Dense', (None, 1), 32769, 'relu']
Test failed

Expected value

['ReLU', (None, 64, 64, 32), 0]

does not match the input value:

['Activation', (None, 64, 64, 32), 0]

Any hints for me?
Thank you for looking into this.

Btw: if a mentor could add some information to the assignment regarding syntax, especially that "tf.keras.layers.Input(shape=(64,64,3))" is needed as the first element of the model, it would probably help a lot of students and save a lot of time (also yours)…

Hi,
I think you used "tf.keras.layers.Activation('relu')" instead of "tf.keras.layers.ReLU()" for your 4th layer.
Even though I think they should technically do the same thing, the test condition is just written that way…

Btw, I think for the last dense layer you should go with the sigmoid activation instead of relu as well.
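Something like the following sketch is what I mean (just an illustration: the Conv2D filter count, kernel size and padding are my guesses inferred from the shapes and parameter counts in your printed summary, not something quoted from the assignment):

```python
import tensorflow as tf

# Illustrative sketch only: layer hyperparameters are inferred from the
# shapes/parameter counts in the printed summary above, not from the assignment.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.ZeroPadding2D(padding=(3, 3)),
    tf.keras.layers.Conv2D(filters=32, kernel_size=(7, 7)),
    tf.keras.layers.BatchNormalization(axis=3),
    tf.keras.layers.ReLU(),                          # instead of Activation('relu')
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation='sigmoid', name='fc'),
])
```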


Thanks for answering, Dyxuki. It worked and all tests passed.

It seems that
"tf.keras.layers.ReLU()"
differs from
"tf.keras.layers.Activation('relu')" for this specific test.
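A quick snippet makes the difference visible (illustrative only, with made-up numbers): both layers return the same values, but their class names differ, and the class name is what the test output above compares.

```python
import numpy as np
import tensorflow as tf

x = tf.constant(np.array([[-2.0, -0.5, 0.0, 1.5]]), dtype=tf.float32)

relu_layer = tf.keras.layers.ReLU()
act_layer = tf.keras.layers.Activation('relu')

print(relu_layer(x).numpy())        # [[0.  0.  0.  1.5]]
print(act_layer(x).numpy())         # [[0.  0.  0.  1.5]]  (same values)
print(type(relu_layer).__name__)    # ReLU
print(type(act_layer).__name__)     # Activation  (this is what the comparator sees)
```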

The documentation doesn't make it clear to me whether an activation function, once defined, is applied to all subsequent layers. So how can I specify the sigmoid activation in general, when it is not passed as an argument within the dense layer? (which is how I used it, as follows)

(1, activation='sigmoid', name='fc')

Is the use of "tensorflow.keras.layers.sigmoid()" before the dense layer correct? Or is "tf.keras.activations.sigmoid()" the appropriate one?

Hi,
I've found that there are a lot of redundant API functions in TensorFlow Keras. I believe "tf.keras.layers.ReLU()" and "tf.keras.layers.Activation('relu')" do very similar things, if not identical ones, and they might actually even be wired to the same backend.
When you add an activation layer, it is applied to the output of the layer before it.
So in that sense, the activation layer should always come after the dense layer if you want it applied to the dense layer's output.
By default, the dense layer comes with the linear activation, which basically means no activation. If you recall what you saw in the courses, this is what we denoted z.
Adding an activation after that layer specifies the function g (in the course's notation), so in the end you'll have a = g(z).
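As a tiny illustration (made-up input, not the assignment code): the Dense layer alone gives you z, and applying a sigmoid on top gives a = g(z).

```python
import tensorflow as tf

x = tf.random.normal((4, 3))          # dummy batch: 4 examples, 3 features

dense = tf.keras.layers.Dense(1)      # default activation is linear, so this outputs z
z = dense(x)
a = tf.keras.activations.sigmoid(z)   # a = g(z), applied to the previous layer's output
```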

So, coming back to your question, I believe you have to specify the activation for each layer (where needed) individually.
For the dense layer, you have two choices:
either pass it directly to the dense layer, "tf.keras.layers.Dense(units, activation='sigmoid')", or use "tf.keras.layers.Dense(units)" followed by a separate activation layer such as "tf.keras.layers.Activation('sigmoid')" (note that "tf.keras.activations.sigmoid" is a plain function, not a layer, so it can't be added to a Sequential model by itself).
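To make the two options concrete, a minimal sketch (the input shape here is just the flattened size from your summary):

```python
import tensorflow as tf

# Option 1: activation passed as a keyword argument of the Dense layer
m1 = tf.keras.Sequential([
    tf.keras.Input(shape=(32768,)),
    tf.keras.layers.Dense(1, activation='sigmoid', name='fc'),
])

# Option 2: a linear Dense layer followed by a separate activation layer
m2 = tf.keras.Sequential([
    tf.keras.Input(shape=(32768,)),
    tf.keras.layers.Dense(1, name='fc'),
    tf.keras.layers.Activation('sigmoid'),
])
```

Note that in option 2 the Dense layer itself still reports a linear activation, so if the comparator checks the activation stored on the Dense layer (as your printed summary suggests), option 1 is probably the one the test expects.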

Btw, it seems that "tensorflow.keras.layers.sigmoid()" doesn't exist; the API is highly inconsistent :')

Hope this helps