How the original layer object's trainable property is updated

In Course 4, Week 2, Assignment 2 (Transfer Learning with MobileNetV2), Exercise 3:

Can anyone help me understand how/when the original model2.layers[4] trainable property is updated? It looks to me like the trainable property of a different variable, base_model, is being set instead.

A similar thing happens in the for loop as well, which also confuses me.

base_model = model2.layers[4]
base_model.trainable = True
# Freeze all the layers before the `fine_tune_at` layer
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False

In Python, assigning a list to a second variable does not copy it; both variables refer to the same underlying list.

my_list = [1,2,3]
also_mine = my_list
also_mine.append(4)
print(my_list)

Output: [1, 2, 3, 4]
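
The same logic applies to the for loop in your snippet: the slice base_model.layers[:fine_tune_at] builds a new list, but each element of that new list is still a reference to a layer object inside the model. Here is a plain-Python sketch of that behavior, with dicts standing in for layers (the names are made up purely for illustration):

fake_layers = [{"trainable": True}, {"trainable": True}, {"trainable": True}]

# A slice creates a new list, but its elements are the same objects
for layer in fake_layers[:2]:
    layer["trainable"] = False   # mutates the original dicts in place

print(fake_layers)

Output: [{'trainable': False}, {'trainable': False}, {'trainable': True}]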

model.layers returns a list where each element is a reference to a model layer (i.e., not a deep copy of the layer).

Here’s an example:

import tensorflow as tf
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, input_shape=[1]),
    tf.keras.layers.Dense(1),
])
model.summary()

The number of trainable parameters is 31:

Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_2 (Dense)             (None, 10)                20        
                                                                 
 dense_3 (Dense)             (None, 1)                 11        
                                                                 
=================================================================
Total params: 31
Trainable params: 31
Non-trainable params: 0
_________________________________________________________________
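
(For the curious: the Param # column is weights plus biases. dense_2 has 1 input × 10 units + 10 biases = 20 parameters, and dense_3 has 10 inputs × 1 unit + 1 bias = 11, for 31 in total.)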

After marking the last layer as non-trainable, the number of trainable parameters goes down to 20:

model.layers[-1].trainable = False
model.summary()

Output:

Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_2 (Dense)             (None, 10)                20        
                                                                 
 dense_3 (Dense)             (None, 1)                 11        
                                                                 
=================================================================
Total params: 31
Trainable params: 20
Non-trainable params: 11
_________________________________________________________________
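
To tie this back to your snippet: base_model = model2.layers[4] does not copy the layer, so base_model.trainable = True flips the flag on the very layer object stored inside model2. As a minimal sketch, reusing the toy model from above:

last = model.layers[-1]            # a reference, not a copy
print(last is model.layers[-1])    # True: the exact same object

last.trainable = True              # un-freezes the layer inside the model
print(model.layers[-1].trainable)  # True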

Does your code make sense now?


Thank you! This makes sense now.