I have one more problem in Ex3 that I do not understand:
Here is my code:
base_model = model2.layers[4]
base_model.trainable = True
# Let's take a look to see how many layers are in the base model
print("Number of layers in the base model: ", len(base_model.layers))
# Fine-tune from this layer onwards
fine_tune_at = 120
### START CODE HERE
# Freeze all the layers before the fine_tune_at layer
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False
# Define a BinaryCrossentropy loss function. Use from_logits=True
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits = True)
# Define an Adam optimizer with a learning rate of 0.1 * base_learning_rate
optimizer = tf.keras.optimizers.Adam(learning_rate = (0.1 * base_learning_rate))
# Use accuracy as evaluation metric
# metrics = tf.keras.metrics.Accuracy()
metrics = 'accuracy'
# print(metrics[0]) ## is "a" of course
### END CODE HERE
model2.compile(loss = loss_function,
               optimizer = optimizer,
               metrics = metrics)
In the grader cell there is:
assert type(loss_function) == tf.python.keras.losses.BinaryCrossentropy, "Not the correct layer"
assert loss_function.from_logits, "Use from_logits=True"
assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
assert metrics[0] == 'accuracy', "Wrong metric"
print('\033[92mAll tests passed!')
But assert metrics[0] gives you the first character of the string "accuracy", which is "a".
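To show the indexing difference I mean, here is a minimal sketch in plain Python (no TensorFlow needed). The names metrics_str and metrics_list are just for illustration, not from the assignment:

metrics_str = 'accuracy'      # a plain string, as in my code above
metrics_list = ['accuracy']   # the same metric wrapped in a list

print(metrics_str[0])    # prints 'a'        -> indexing a string gives its first character
print(metrics_list[0])   # prints 'accuracy' -> indexing a list gives its first element

assert metrics_list[0] == 'accuracy'    # this form passes the grader-style check
# assert metrics_str[0] == 'accuracy'   # this form would raise AssertionError

So if I read the assert correctly, it only holds when metrics is a list rather than a bare string.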