About improving the classification

I have tried adding a layer to the ResNet model, as shown:

import tensorflow as tf
import tensorflow_hub as hub

# TF Hub handle for the ResNet-50 feature vector
MODULE_HANDLE = "https://tfhub.dev/tensorflow/resnet_50/feature_vector/1"

class ResNetModel(tf.keras.Model):
    def __init__(self, classes):
        super(ResNetModel, self).__init__()
        # Frozen pre-trained feature extractor
        self._feature_extractor = hub.KerasLayer(MODULE_HANDLE, trainable=False)
        # New trainable layer between the extractor and the classifier
        self._added_layer = tf.keras.layers.Dense(32, activation="relu")
        self._classifier = tf.keras.layers.Dense(classes, activation="softmax")

    def call(self, inputs):
        x = self._feature_extractor(inputs)
        x = self._added_layer(x)
        x = self._classifier(x)
        return x
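
For reference, this is roughly how such a model can be compiled and trained (a minimal sketch; train_ds, test_ds, and the class count of 5 are placeholders for my actual setup):

# Hypothetical usage: train_ds / test_ds are tf.data.Dataset objects
# yielding (image_batch, integer_label_batch) pairs.
model = ResNetModel(classes=5)  # 5 is a placeholder class count
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=test_ds, epochs=10)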

The test accuracy has dropped a lot.
When training for more epochs, the training accuracy quickly reaches 100%, but the test accuracy gets stuck at 61.7%, much worse than with the model without the added layer. I expected some improvement in the result. Where is my error?

The problem is probably the number of neurons in the added layer. I think 32 units is too small to accommodate and process the flow of information coming from the layer before it: the ResNet-50 feature vector is 2048-dimensional, so far more neurons feed into the added layer than it has.

We had a similar question some time ago. Why don't you try increasing the number of neurons in the added layer and see the effect?
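
For instance, a minimal sketch with a configurable width, reusing the imports and MODULE_HANDLE from your post (the default of 256 units is only an assumed starting point to tune):

class WiderResNetModel(tf.keras.Model):
    def __init__(self, classes, units=256):  # 256 is an assumed width
        super(WiderResNetModel, self).__init__()
        self._feature_extractor = hub.KerasLayer(MODULE_HANDLE, trainable=False)
        # A wider trainable layer is less of a bottleneck after the
        # 2048-dimensional ResNet-50 feature vector.
        self._added_layer = tf.keras.layers.Dense(units, activation="relu")
        self._classifier = tf.keras.layers.Dense(classes, activation="softmax")

    def call(self, inputs):
        x = self._feature_extractor(inputs)
        x = self._added_layer(x)
        x = self._classifier(x)
        return x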

Thank you @gent.spah for your answer. You are right: after increasing the number of neurons in the added layer, the test accuracy of the model with the _added_layer outperforms the model without it. Nevertheless, both models are severely overfitting, because the training accuracy is very high (98%-100%). How could we cope with this overfitting?

To deal with overfitting in general, you need to split the dataset so that all sets, i.e. train, dev, and test, are equally representative of the distribution.
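
For example, a minimal sketch of a stratified split with scikit-learn, where X and y are placeholders for your images and labels:

from sklearn.model_selection import train_test_split

# stratify keeps the class proportions identical across all three sets
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)
X_dev, X_test, y_dev, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.5, stratify=y_tmp, random_state=42)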

Increase the dataset using augmentations.
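
For example, with the Keras preprocessing layers (available as tf.keras.layers in recent TensorFlow versions; the transforms and ranges below are assumptions to adapt to your images):

import tensorflow as tf

# These layers apply random transforms during training only;
# at inference time they pass the images through unchanged.
data_augmentation = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])

Calling data_augmentation(inputs) as the first step of call(), before the feature extractor, applies them on the fly.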

Use regularization such as dropout or other techniques.
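
For dropout specifically, a minimal sketch of where it could go in your model (the 0.5 rate is an assumed value to tune):

# In __init__, add a dropout layer between the added layer and the classifier;
# the 0.5 rate is an assumed value worth tuning.
self._dropout = tf.keras.layers.Dropout(0.5)

# In call(); Keras activates dropout only during training:
x = self._added_layer(x)
x = self._dropout(x)
x = self._classifier(x)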

Sometimes a simplified network can bring improvement.

And, as I suggested above, you can also increase the number of neurons in the added layer.