Transfer_learning_with_MobileNet_v1 alpaca_model

Hi. I’m having an issue with the alpaca_model where I’m getting the error:
The number of layers in the model is incorrect. Expected: 8 Found: 7

I’m not sure whether I’m adding the final layers for binary classification correctly.

Here’s where I add the new binary classification layers:

# use global avg pooling to summarize the info in each channel
x = tf.keras.layers.GlobalAveragePooling2D()(x)
# include dropout with probability of 0.2 to avoid overfitting
x = tf.keras.layers.Dropout(.2)(x)
    
# use a prediction layer with one neuron (as a binary classifier only needs one)
outputs = tf.keras.layers.Dense(1)(x)

I didn’t see a specific binary classification layer in the keras.layers documentation, and I’m not sure if I need to use the tf.keras.layers.Add() function, but I tried that and it still gave me the same error.
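For context, my understanding is that the single linear unit just outputs a logit, and the sigmoid gets applied later through the loss when compiling, something like this (I’m not certain this is exactly what the notebook expects):

loss = tf.keras.losses.BinaryCrossentropy(from_logits=True)  # pairs with the linear Dense(1) logit output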
These are the layers I see in model2:

['InputLayer', [(None, 160, 160, 3)], 0]
['TensorFlowOpLayer', [(None, 160, 160, 3)], 0]
['TensorFlowOpLayer', [(None, 160, 160, 3)], 0]
['Functional', (None, 5, 5, 1280), 2257984]
['GlobalAveragePooling2D', (None, 1280), 0]
['Dropout', (None, 1280), 0, 0.2]
['Dense', (None, 1), 1281, 'linear']

I think it’s looking to match the following alpaca_summary:
[['InputLayer', [(None, 160, 160, 3)], 0],
['Sequential', (None, 160, 160, 3), 0],
['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
['Functional', (None, 5, 5, 1280), 2257984],
['GlobalAveragePooling2D', (None, 1280), 0],
['Dropout', (None, 1280), 0, 0.2],
['Dense', (None, 1), 1281, 'linear']]  # linear is the default activation

So it’s missing the Sequential layer? I didn’t see where in the assignment steps the Sequential layer should have been added.

The Sequential layer is the data augmentation function. That function was passed as an argument to the model function, and then you need to invoke it appropriately.
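If it helps, here is a minimal sketch of what that augmenter typically looks like. The exact layers (a horizontal flip and a small random rotation) are an assumption based on the usual version of this assignment, so follow whatever the notebook specifies:

import tensorflow as tf

def data_augmenter():
    # Assumed layers: horizontal flip plus a 20% random rotation.
    # In older TF versions these live under tf.keras.layers.experimental.preprocessing.
    data_augmentation = tf.keras.Sequential([
        tf.keras.layers.RandomFlip('horizontal'),
        tf.keras.layers.RandomRotation(0.2),
    ])
    return data_augmentation

Because the augmenter is a tf.keras.Sequential model, invoking it on the inputs inside alpaca_model is exactly what produces the 'Sequential' entry in the expected summary.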

BTW you filed this under DLS Course 2, but I moved it to DLS Course 4 using the little “edit pencil” on the title.

Thanks, I just figured this out and it’s working now.

Hi, I get the following instead:


I don’t see the Sequential layer either, even though I initialized it in the data_augmentation function.
Here’s a snippet of my code:

{moderator edit - solution code removed}

Your data_augmentation should be a function call. I think in my assignment the function was named data_augmenter(), and you have to pass it the inputs.

But data_augmentation is a function, right? So you need to pass it an input, and be careful to use the name of the parameter that was actually passed into the model function. The way you wrote the code, x is the function object itself after the data augmentation step, and then you overwrite it on the next step, which is why the augmentation does not show up in the compute graph.
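To make that concrete, here is a hypothetical before/after; the variable names are just for illustration, not the graded solution:

# Problem pattern: this assigns the Sequential object itself, and the next
# line overwrites x, so the augmentation never enters the compute graph
x = data_augmentation
x = preprocess_input(inputs)

# Instead: call the augmenter on the model's input tensor
x = data_augmentation(inputs)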

So the overall point is that a bit more careful thought is required here. :nerd_face:

Managed to solve it! Thanks

Thanks, I had the same issue with 7 layers instead of 8. It turns out I hadn’t invoked data_augmentation correctly, and I was also applying the preprocess_input function to the wrong variable.
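In case it helps anyone else, this is roughly the ordering that fixed it for me (the variable names are assumptions from my own code, not necessarily the official solution):

x = data_augmentation(inputs)      # invoke the augmenter on the input tensor
x = preprocess_input(x)            # preprocess the augmented tensor, not the raw inputs
x = base_model(x, training=False)  # frozen MobileNetV2 feature extractor in inference mode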