C4W2 Coding Assignment 2: fine-tuning question: why does the first epoch show 5/10?

I have a question: why does training during fine-tuning start from epoch 5/10 instead of 6/10? In my opinion, if we set “fine_tune_epochs = 5”, the fine-tuning phase should run as “epoch 6/10, 7/10, 8/10, 9/10, 10/10”.

Does this mean that the 5/5 epochs we trained before overlap with the fine-tuning epochs?

I believe it’s because of this parameter:
initial_epoch = history.epoch[-1]

This causes the fine-tuning to start from the last previous epoch. Keras stores epoch numbers 0-indexed, so after 5 epochs history.epoch[-1] is 4; passing that as initial_epoch makes training resume at epoch index 4, which Keras prints 1-based as “Epoch 5/10”. So fine-tuning runs for six cycles (epochs 5 through 10), and since we previously did 1 through 5, we’re actually doing 11 training cycles in total.
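To make the numbering concrete, here is a minimal sketch of the behavior. The toy model and random data are stand-ins of my own, not the assignment’s actual model, but the epoch counter behaves the same way:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model and data, just to demonstrate the epoch numbering.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(16, 4).astype("float32")
y = np.random.rand(16, 1).astype("float32")

# Initial training phase: prints "Epoch 1/5" ... "Epoch 5/5".
history = model.fit(x, y, epochs=5, verbose=2)

print(history.epoch)      # [0, 1, 2, 3, 4] -- Keras records 0-based indices
print(history.epoch[-1])  # 4

# Fine-tuning phase: initial_epoch=4 resumes at epoch index 4, which Keras
# prints 1-based as "Epoch 5/10". So this runs 6 more epochs (5/10 through
# 10/10), for 11 passes over the data in total.
history_fine = model.fit(x, y,
                         epochs=10,
                         initial_epoch=history.epoch[-1],
                         verbose=2)
```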


Thanks for your answer! I think we set initial_epoch = history.epoch[-1] to make sure that the learning curves of transfer learning and fine-tuning connect to each other, right? (Because if we set initial_epoch = history.epoch[-1] + 1, the learning curves would have a gap between 5/5 and 6/10?)
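Here is a quick sketch of what I mean, continuing from the toy example above (plotting the loss with matplotlib is my own addition, not necessarily how the assignment notebook draws its curves):

```python
import matplotlib.pyplot as plt

# history.epoch is [0, 1, 2, 3, 4]; with initial_epoch = history.epoch[-1],
# history_fine.epoch is [4, 5, 6, 7, 8, 9], so the fine-tuning curve's first
# point sits at the same x-coordinate as the first phase's last point and the
# two curves meet. With initial_epoch = history.epoch[-1] + 1, the fine-tuning
# curve would instead start one epoch to the right, leaving a gap.
plt.plot(history.epoch, history.history["loss"], label="transfer learning")
plt.plot(history_fine.epoch, history_fine.history["loss"], label="fine-tuning")
plt.xlabel("epoch index (0-based)")
plt.ylabel("loss")
plt.legend()
plt.show()
```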

I do not know for certain; I have not experimented with it.

Agreed! Excellent explanation!