Hey @QUY_LE_MINH,
We are glad that you are enjoying the assignments. The answer to your question is actually hidden in the assignment itself. Find the following lines of code in your assignment:
for layer in summary(model2):
    print(layer)
When you run this code, you should see output along the following lines:
['InputLayer', [(None, 160, 160, 3)], 0]
['Sequential', (None, 160, 160, 3), 0]
['TFOpLambda', (None, 160, 160, 3), 0]
['TFOpLambda', (None, 160, 160, 3), 0]
['Functional', (None, 5, 5, 1280), 2257984]
['GlobalAveragePooling2D', (None, 1280), 0]
['Dropout', (None, 1280), 0, 0.2]
['Dense', (None, 1), 1281, 'linear']
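In case you are wondering where that list-of-lists format comes from: summary is a small helper shipped with the assignment (in test_utils.py, if I remember correctly). It roughly boils down to something like this; this is just a sketch of the idea, not the exact assignment code:

from tensorflow.keras.layers import Dense, Dropout

def summary(model):
    """Describe each layer as [class name, output shape, # params, extras]."""
    result = []
    for layer in model.layers:
        descriptors = [layer.__class__.__name__, layer.output_shape, layer.count_params()]
        if isinstance(layer, Dropout):
            descriptors.append(layer.rate)                   # e.g. 0.2
        elif isinstance(layer, Dense):
            descriptors.append(layer.activation.__name__)    # e.g. 'linear'
        result.append(descriptors)
    return result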
Now, from this output, we can clearly see that the first layer is the InputLayer, and the 3 layers that follow (the Sequential and the two TFOpLambda layers) handle the data augmentation and pre-processing. The 5th layer is the Functional model, which is nothing but our base_model. It is this sub-model (you can think of the name base_model as just a reference for us) whose layers we want to freeze, hence the 2 lines of code; there is a rough sketch of the idea below.
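If it helps to see it concretely, here is a minimal sketch of what those lines boil down to. The exact index, trainable flag, and cut-off value should come from your own notebook, so treat the specifics below as assumptions:

# model2 is the model built earlier in the assignment.
# Index 4 picks out the 5th entry of model2.layers: the Functional sub-model,
# i.e. the MobileNetV2 base_model (indexing starts at 0).
base_model = model2.layers[4]

# Un-freeze the sub-model as a whole, then re-freeze everything before the
# chosen cut-off so that only its top layers get fine-tuned.
base_model.trainable = True
fine_tune_at = 120  # assumed cut-off; use whatever value your notebook specifies
for layer in base_model.layers[:fine_tune_at]:
    layer.trainable = False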
I hope this resolves your query.

Cheers,
Elemento