C4W2A2 Transfer Learning model summary

In Exercise 2 - alpaca_model, I get the following model summary:

```
Model: "functional_6"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_25 (InputLayer)        [(None, 160, 160, 3)]     0
tf_op_layer_RealDiv_6 (Tenso [(None, 160, 160, 3)]     0
tf_op_layer_Sub_6 (TensorFlo [(None, 160, 160, 3)]     0
mobilenetv2_1.00_160 (Functi (None, 5, 5, 1280)        2257984
global_average_pooling2d_6 ( (None, 1280)              0
dropout_3 (Dropout)          (None, 1280)              0
dense_3 (Dense)              (None, 1)                 1281
```

From the following check of my code I get:
AssertionError: The number of layers in the model is incorrect. Expected: 8 Found: 7

As you can see from the summary above, I am missing the "Sequential" layer.
I did include it in the function data_augmenter, i.e. in Exercise 1.

So now I do not understand how I get the Sequential Layer printed out in the model summary.

When I add x = tf.keras.Sequential() before or after the step x = data_augmentation, there is no change in the output.

You don’t need to do this. Please check your DM for further instructions.

To update others: there were several mistakes @Peter_Grabner was making, like not passing the input argument to the augmentation and preprocessing the wrong data. I think re-reading the instructions (after writing your code) would help you out.
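For anyone else hitting the "Expected: 8 Found: 7" assertion: the Sequential returned by data_augmenter only shows up as a layer in the summary when it is *called on the input tensor* inside the functional model. Here is a minimal sketch of that pattern (the specific augmentation layers are illustrative, not the graded solution):

```python
import tensorflow as tf

def data_augmenter():
    """A Sequential holding the augmentation layers (Exercise 1 style)."""
    return tf.keras.Sequential([
        tf.keras.layers.RandomFlip('horizontal'),
        tf.keras.layers.RandomRotation(0.2),
    ])

inputs = tf.keras.Input(shape=(160, 160, 3))
# Calling the Sequential on the input tensor is what registers it
# as a layer of the functional model:
x = data_augmenter()(inputs)
model = tf.keras.Model(inputs, x)
model.summary()  # now lists a 'sequential' layer after the InputLayer
```

If the Sequential is never applied to a tensor in the graph, it simply is not part of the model, which is why the layer count comes up one short.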

So I have one more problem I do not understand, in Exercise 3.
Here is my code:

```python
base_model = model2.layers[4]
base_model.trainable = True
# Let's take a look to see how many layers are in the base model
print("Number of layers in the base model: ", len(base_model.layers))

# Fine-tune from this layer onwards
fine_tune_at = 120

# Freeze all the layers before the fine_tune_at layer
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False

# Define a BinaryCrossentropy loss function. Use from_logits=True
loss_function = tf.keras.losses.BinaryCrossentropy(from_logits=True)
# Define an Adam optimizer with a learning rate of 0.1 * base_learning_rate
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1 * base_learning_rate)
# Use accuracy as evaluation metric
# metrics = tf.keras.metrics.Accuracy()
metrics = 'accuracy'
# print(metrics[0])  ## is "a" of course

model2.compile(loss=loss_function,
               optimizer=optimizer,
               metrics=metrics)
```

In the grader cell there is

```python
assert type(loss_function) == tf.python.keras.losses.BinaryCrossentropy, "Not the correct layer"
assert loss_function.from_logits, "Use from_logits=True"
assert type(optimizer) == tf.keras.optimizers.Adam, "This is not an Adam optimizer"
assert optimizer.lr == base_learning_rate / 10, "Wrong learning rate"
assert metrics[0] == 'accuracy', "Wrong metric"

print('\033[92mAll tests passed!')
```

But assert metrics[0] gives you the first character of the string "accuracy", which is "a".

Have you tried reading the TF documentation or searching the forums for “metrics”? There are several ways to specify accuracy as the metric, but your syntax is not one of them. :grinning: Just as a hint, here’s one sentence I found on the page I linked above:

The metrics argument should be a list -- your model can have any number of metrics.

The way you specified it is a string. I guess you could say that a string is a “list of characters”, but that’s not what they mean here. What they meant is a list of strings or a list of references to actual instantiated functions.
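To see the difference concretely, this is plain Python indexing, nothing TF-specific:

```python
# A bare string is itself indexable, so metrics[0] is a single character:
metrics = 'accuracy'
print(metrics[0])   # 'a' -> the grader's check metrics[0] == 'accuracy' fails

# Wrapping it in a list gives the grader (and compile()) what it expects:
metrics = ['accuracy']
print(metrics[0])   # 'accuracy'
```

Passing metrics=['accuracy'] to model2.compile() then satisfies both Keras and the assertion.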

Okay, thanks. I misunderstood that in the documentation.
It is working now.