Alpaca Model implementation

I am struggling to implement the Alpaca model. I am getting an error that my model has only 7 layers instead of 8. When I compare with the test model, it looks like I am missing the layer named "Sequential".

Any tips on what is wrong with my code?

Below are my code, my model summary, and the error message.

Any tips are welcome.

Thanks in advance.

{moderator edit - solution code removed}

model2 = alpaca_model(IMG_SIZE, data_augmentation)
model2.summary()

inputs: (None, 160, 160, 3)
data augmentation: <tensorflow.python.keras.engine.sequential.Sequential object at 0x7f3b0c43f550>
Model: "functional_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_5 (InputLayer)         [(None, 160, 160, 3)]     0
_________________________________________________________________
tf_op_layer_RealDiv_1 (Tenso [(None, 160, 160, 3)]     0
_________________________________________________________________
tf_op_layer_Sub_1 (TensorFlo [(None, 160, 160, 3)]     0
_________________________________________________________________
mobilenetv2_1.00_160 (Functi (None, 5, 5, 1280)        2257984
_________________________________________________________________
global_average_pooling2d_1 ( (None, 1280)              0
_________________________________________________________________
dropout_1 (Dropout)          (None, 1280)              0
_________________________________________________________________
dense_1 (Dense)              (None, 1)                 1281
=================================================================
Total params: 2,259,265
Trainable params: 1,281
Non-trainable params: 2,257,984

from test_utils import summary, comparator

alpaca_summary = [['InputLayer', [(None, 160, 160, 3)], 0],
                  ['Sequential', (None, 160, 160, 3), 0],
                  ['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
                  ['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
                  ['Functional', (None, 5, 5, 1280), 2257984],
                  ['GlobalAveragePooling2D', (None, 1280), 0],
                  ['Dropout', (None, 1280), 0, 0.2],
                  ['Dense', (None, 1), 1281, 'linear']]  # linear is the default activation

comparator(summary(model2), alpaca_summary)

for layer in summary(model2):
    print(layer)

AssertionError                            Traceback (most recent call last)
in
     10  ['Dense', (None, 1), 1281, 'linear']] #linear is the default activation
     11
---> 12 comparator(summary(model2), alpaca_summary)
     13
     14 for layer in summary(model2):

~/work/W2A2/test_utils.py in comparator(learner, instructor)
     14 def comparator(learner, instructor):
     15     if len(learner) != len(instructor):
---> 16         raise AssertionError(f"The number of layers in the model is incorrect. Expected: {len(instructor)} Found: {len(learner)}")
     17     for a, b in zip(learner, instructor):
     18         if tuple(a) != tuple(b):

AssertionError: The number of layers in the model is incorrect. Expected: 8 Found: 7

On first look, it seems you do not use this x variable (which is a model object, not a tensor) anywhere:
x gets reassigned in the next line.

x = data_augmenter()
# print ("data augmentation:",x) 

Yes, that’s part of the problem. It’s also a mistake to call data_augmenter directly there. That is a function that returns a function, so the x that you are discarding ends up being a function, not a tensor. You’re supposed to call the function that was actually passed in, which is called data_augmentation, right? And you need to pass it a tensor as an argument, since it’s a sequential layer.
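To make the distinction concrete, here is a minimal sketch of the pattern. The data_augmenter below is a hypothetical stand-in for the course helper (the real one may use different augmentation layers); the point is that calling it returns a Sequential model object, while calling the returned data_augmentation on the input tensor returns a tensor:

```python
import tensorflow as tf

def data_augmenter():
    # Hypothetical stand-in: returns a Sequential of augmentation layers,
    # just like the helper in the assignment returns a model, not a tensor.
    return tf.keras.Sequential([
        tf.keras.layers.RandomFlip("horizontal"),
    ])

data_augmentation = data_augmenter()
inputs = tf.keras.Input(shape=(160, 160, 3))

# Wrong: calling data_augmenter() again just builds another Sequential object.
x_wrong = data_augmenter()
print(type(x_wrong))  # a Sequential model, not a tensor

# Right: call the Sequential that was passed in, on the input tensor.
x = data_augmentation(inputs)
print(x.shape)  # (None, 160, 160, 3) - a symbolic tensor you can keep building on
```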


Yes, understood. A direct call to data_augmenter returns a function, not a tensor. I have to call data_augmentation instead, and feed x into the next call, preprocess_input. When I print after data_augmentation and preprocess_input, I get tensors.
Thanks !

inputs.shape (None, 160, 160, 3)
type(x)=<class ‘tensorflow.python.framework.ops.Tensor’>
type(x)=<class ‘tensorflow.python.framework.ops.Tensor’>
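The printouts above can be reproduced with a short chain like the following. The augmenter here is a hypothetical one-layer stand-in for the course helper; preprocess_input is the real MobileNetV2 preprocessing function (it produces the RealDiv/Sub op layers seen in the model summary), and both calls keep x as a tensor:

```python
import tensorflow as tf
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input

# Hypothetical stand-in for the assignment's data_augmenter()
data_augmentation = tf.keras.Sequential([tf.keras.layers.RandomFlip("horizontal")])

inputs = tf.keras.Input(shape=(160, 160, 3))
print("inputs.shape", inputs.shape)

x = data_augmentation(inputs)  # still a symbolic tensor
print(f"{type(x)=}")

x = preprocess_input(x)        # scales to [-1, 1]; still a tensor
print(f"{type(x)=}")
```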