Transfer Learning Alpaca - Prediction Layer

I am receiving an error on this line.

—> 49 outputs = prediction_layer(x)
50 model = tf.keras.Model(inputs, outputs)

The error message reads as follows: 'tensorflow.python.framework.ops.EagerTensor' object is not callable. After some searching, I am still unsure what this error means or how to fix it.

I think that the error could be with how I created the prediction layer. Here is my code below:
# set training to False to avoid keeping track of statistics in the batch norm layer
x = base_model(image_batch, training=False)

# Add the new Binary classification layers
# use global avg pooling to summarize the info in each channel 
x = tfl.GlobalAveragePooling2D()(x)

#include dropout with probability of 0.2 to avoid overfitting
x = tfl.Dropout(0.2)(x)
# create a prediction layer with one neuron (as a classifier only needs one)
prediction_layer = tfl.Dense(units=1, activation='linear')(x)

Thanks for any advice!


The line in the code (after what you included):
outputs = prediction_layer(x)

applies the layer to the input x, but you have already applied the layer to x in the line where you define prediction_layer.

If you take the "(x)" off the end of the line where you create "prediction_layer", it should work.
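To see why Python complains that an EagerTensor "is not callable": with the (x) at the end, prediction_layer ends up holding the layer's output (a tensor value) rather than the layer object, and a plain value cannot be called. A toy stand-in (illustrative only, not the real Keras class) reproduces the same failure mode:

```python
class ToyDense:
    """Toy stand-in for a Keras Dense layer (illustration only)."""
    def __call__(self, x):
        # pretend forward pass: double every input value
        return [v * 2.0 for v in x]

x = [1.0, 2.0]

# Bug: store the layer's OUTPUT (a plain value), then try to call it
output = ToyDense()(x)
try:
    output(x)  # TypeError: 'list' object is not callable
except TypeError as err:
    print(err)

# Fix: store the LAYER itself, then apply it once
prediction_layer = ToyDense()   # no (x) here
outputs = prediction_layer(x)   # [2.0, 4.0]
```

The real error mentions EagerTensor instead of list because in TensorFlow the stored value is a tensor, but the mechanism is the same.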


Thanks! That makes sense.

The code blocks run, except for the test block:

from test_utils import summary, comparator

alpaca_summary = [['InputLayer', [(None, 160, 160, 3)], 0],
['Sequential', (None, 160, 160, 3), 0],
['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
['TensorFlowOpLayer', [(None, 160, 160, 3)], 0],
['Functional', (None, 5, 5, 1280), 2257984],
['GlobalAveragePooling2D', (None, 1280), 0],
['Dropout', (None, 1280), 0, 0.2],
['Dense', (None, 1), 1281, 'linear']] # linear is the default activation

comparator(summary(model2), alpaca_summary)

for layer in summary(model2):

Test failed
Expected value

['Functional', (None, 5, 5, 1280), 2257984]

does not match the input value:

['GlobalAveragePooling2D', (None, 3), 0]

AssertionError: Error in test

So I guess there is an error in the test block? Did I accidentally modify it?


Yes, you can see that the dimensions for the GlobalAveragePooling2D don't match.

I would check whether you are correctly applying preprocess_input and setting training=False on the base_model.

I already set training=False.

I think you are right about the preprocessing being wrong:

inputs = tf.keras.Input(shape=input_shape)

# apply data augmentation to the inputs
x = data_augmentation(inputs)

# data preprocessing using the same weights the model was trained on
x = preprocess_input(x)

I am really lost as to what could be wrong with this preprocessing. Thanks for your quick responses, laivict.

From this line in your first post:

x = base_model(image_batch, training=False)

You are setting training=False on "image_batch", but in the earlier line you preprocess the input and store the result in "x".

You should call base_model on x instead.


@laivict2 Can you help me with the same issue?
base_model.trainable = False
inputs = tf.keras.Input(shape=input_shape)
x = data_augmentation(inputs)
x = preprocess_input(x)
x = base_model(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)

prediction_layer = tf.keras.layers.Dense(1, activation='linear')

Test failed
Expected value

['Functional', (None, 5, 5, 1280), 2257984]

does not match the input value:

['Functional', (None, None, None, 1280), 2257984]

AssertionError Traceback (most recent call last)
10 ['Dense', (None, 1), 1281, 'linear']] # linear is the default activation
—> 12 comparator(summary(model2), alpaca_summary)
14 for layer in summary(model2):

~/work/release/W2A2/ in comparator(learner, instructor)
19 "\n\n does not match the input value: \n\n",
20 colored(f"{a}", "red"))
—> 21 raise AssertionError("Error in test")
22 print(colored("All tests passed!", "green"))

AssertionError: Error in test

Were you able to correct the code? Could you post the prediction layer code if you cracked it? Thanks in advance.

You create the prediction layer with the Dense() function, without (x) at the end.

My mistake was in another part of the code.

I solved it right after posting. Thank you for the help :)

Thanks a lot boss, this solved the error for me.

Thanks a lot, I had been trying this for 2 days now…

Failing: ['Functional', (None, None, None, 1280), 2257984]
It should be: ['Functional', (None, 5, 5, 1280), 2257984]

What could be wrong?

Ignore, I found the issue!

Hi, I also have a related question about the prediction layer. Since it is a binary classification problem, why are we not using a sigmoid activation instead of linear?

Thanks a lot!
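A likely answer, assuming the assignment's usual setup (this is my reading, not quoted from the notebook): the Dense layer outputs a raw logit because the loss is configured as tf.keras.losses.BinaryCrossentropy(from_logits=True), which applies the sigmoid internally in a numerically stable way; a sigmoid is then only needed at inference time to turn the logit into a probability. The two formulations agree mathematically, as a small pure-Python check shows:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce_from_probability(y, p):
    """Naive binary cross-entropy on a probability (sigmoid output)."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def bce_from_logit(y, z):
    """Numerically stable form, as used when from_logits=True."""
    return max(z, 0.0) - z * y + math.log(1.0 + math.exp(-abs(z)))

# Same loss value whether we pass the probability or the raw logit
z, y = 2.5, 1.0
print(abs(bce_from_probability(y, sigmoid(z)) - bce_from_logit(y, z)) < 1e-9)  # True
```

The stable form avoids computing log(sigmoid(z)) directly, which underflows for large negative logits; that is the usual reason the model head stays linear during training.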