Loss suddenly jumps up

Hello!

I’m working on my own project with a very small dataset. I see a sudden jump in the loss when I apply transfer learning with a very deep network. What could be the issue here?

When does that happen, at the beginning of fine-tuning or in the middle? And how many images are you using for fine-tuning?

50 images: 40 for training, 10 for validation, 6 classes.
It occurs almost at the end of fine-tuning (around the 30th of 50 epochs).
Thanks!
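
For reference, a minimal sketch of this kind of setup, assuming a PyTorch/torchvision workflow with a ResNet-50 backbone and placeholder folder paths (the post doesn’t say which framework or architecture is actually used):

```python
# Sketch of the described setup: pretrained backbone, frozen features,
# and a new 6-class head trained on a tiny dataset for 50 epochs.
# Framework, architecture, and paths are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("data/train", transform=transform)  # 40 images
val_set = datasets.ImageFolder("data/val", transform=transform)      # 10 images
train_loader = torch.utils.data.DataLoader(train_set, batch_size=8, shuffle=True)
val_loader = torch.utils.data.DataLoader(val_set, batch_size=8)

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for p in model.parameters():                    # freeze the pretrained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 6)   # new head for 6 classes

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

for epoch in range(50):                         # whole training set every epoch
    model.train()
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
```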

Do you go through the whole training dataset every epoch, or a different subset each epoch? Also, how does that spike compare to the initial loss: is it as big as the initial loss, about halfway there, etc.? And what happens after the 30th epoch?

I’m using the whole dataset every epoch! Here is a plot of my loss curve!

From a theoretical point of view, when you start fine-tuning you move the weights away from the pretrained optimum. As training continues you can oscillate and occasionally step onto a “cliff” rather than into a “valley” of the loss surface, so a sudden jump like this can happen during fine-tuning.
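
One way to keep those steps small is to fine-tune the pretrained layers with a much lower learning rate than the new head, so each update stays close to the pretrained optimum. A minimal sketch, assuming PyTorch with a ResNet-style model whose `fc` head has been replaced (not something stated in the original posts):

```python
# Hypothetical illustration: tiny learning rate for pretrained layers,
# larger one for the freshly initialised head. Smaller steps make it
# less likely that an update overshoots onto a high-loss "cliff".
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 6)

backbone_params = [p for name, p in model.named_parameters()
                   if not name.startswith("fc")]
optimizer = torch.optim.SGD(
    [
        {"params": backbone_params, "lr": 1e-4},        # pretrained layers: tiny steps
        {"params": model.fc.parameters(), "lr": 1e-2},  # new head: larger steps
    ],
    momentum=0.9,
)
```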

I see that after the 30th epoch the loss starts falling again and, overall, it keeps decreasing. It’s worth noting, though, that the high-dimensional loss surfaces these models live in are complex enough that it’s hard to explain their behaviour completely.
