Hello dear team,
I'm stuck on the last part of the fourth code cell (Exercise 1 - happyModel), at the step with the comment `## Dense layer with 1 unit for output & 'sigmoid' activation`.
After figuring out that the `tf.keras.layers.` prefix is a syntax prerequisite here (compared to the old non-Keras version), I used my old code with the parameters `(1, activation='sigmoid', name='fc')`.
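In isolation (leaving out the rest of the Sequential list, which I took from the template), that line looks roughly like this:

```python
import tensorflow as tf

# Final classification layer as I wrote it: one output unit with
# a sigmoid activation for binary classification.
output_layer = tf.keras.layers.Dense(1, activation='sigmoid', name='fc')
```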
But the error message is as follows:
```
['ZeroPadding2D', (None, 70, 70, 3), 0, ((3, 3), (3, 3))]
['Conv2D', (None, 64, 64, 32), 4736, 'valid', 'linear', 'GlorotUniform']
['BatchNormalization', (None, 64, 64, 32), 128]
['Activation', (None, 64, 64, 32), 0]
['MaxPooling2D', (None, 32, 32, 32), 0, (2, 2), (2, 2), 'valid']
['Flatten', (None, 32768), 0]
['Dense', (None, 1), 32769, 'relu']

Test failed
Expected value

['ReLU', (None, 64, 64, 32), 0]

does not match the input value:

['Activation', (None, 64, 64, 32), 0]
```
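If I'm reading the comparator output right, it distinguishes between two different layer classes that both apply ReLU. Here is a tiny sketch of the two variants, just to show what I think is being compared (I'm not sure this is actually what trips the test):

```python
import tensorflow as tf

# The comparator apparently expects this layer class for the activation step:
expected = tf.keras.layers.ReLU()

# My model seems to contain this one instead (listed as 'Activation' above):
actual = tf.keras.layers.Activation('relu')

# Both compute max(x, 0), but they are different classes, so a
# layer-by-layer comparison reports them as a mismatch.
print(type(expected).__name__)  # -> ReLU
print(type(actual).__name__)    # -> Activation
```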
Any hints for me?
Thank you for looking into this.
By the way: if a mentor could add some information to the assignment regarding the syntax, especially that `tf.keras.layers.Input(shape=(64,64,3)),` is crucial for the code, it would probably help a lot of students and save a lot of time (yours included)…
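Something like this minimal sketch is what I have in mind (only the first couple of entries, not the full solution):

```python
import tensorflow as tf

# The Sequential model needs to know the input shape; declaring it with an
# explicit Input layer as the very first entry does that.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),       # the line I mean
    tf.keras.layers.ZeroPadding2D(padding=(3, 3)),  # first layer from the template
    # ... the remaining layers of the exercise go here ...
])
```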