Difference between epoch and iteration

Hi, could you please tell me if epoch and iteration essentially mean the same thing?

Thanks.

Venkaatesh

Hi Venkaatesh,

Well, not exactly. An iteration counts how many passes (forward + backward, i.e. one parameter update) the algorithm has performed on a batch of data. An epoch, on the other hand, counts how many times the algorithm has been trained on the entire dataset.

In batch gradient descent, where you pass the entire dataset to the algorithm as one batch, 1 iteration = 1 epoch, since each iteration sees the entire dataset. In mini-batch gradient descent, however, the dataset is split into multiple mini-batches and each iteration processes just one mini-batch, so there are multiple iterations per epoch.
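To make the counting concrete, here's a minimal sketch in plain Python (the dataset size, batch size, and epoch count are made-up numbers, and the actual training step is just a comment standing in for one forward + backward pass):

```python
import math

num_examples = 1000   # hypothetical training-set size
batch_size = 100      # hypothetical mini-batch size

batches_per_epoch = math.ceil(num_examples / batch_size)  # 10 mini-batches

iteration = 0
for epoch in range(3):                      # 3 full passes over the data
    for batch in range(batches_per_epoch):  # one mini-batch per iteration
        # one forward + backward pass on this mini-batch would go here
        iteration += 1
    print(f"epoch {epoch + 1} done after {iteration} total iterations")

# Prints 10, 20, 30: iterations per epoch == number of mini-batches.
# With batch_size == num_examples (batch gradient descent),
# batches_per_epoch == 1, so 1 iteration == 1 epoch.
```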

Hope this helps.


Hi Somesh,

Many thanks. So can I say that the number of iterations is equal to the number of mini-batches of the training set?

If so, when we plot cost vs. iterations in batch gradient descent, is it like cost vs. (iterations * epochs)?

Hi Venkaatesh,

Yeah, the number of iterations per epoch is equal to the number of mini-batches.
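As a quick sanity check with made-up numbers (note the last mini-batch may be smaller than the rest, which is why the count rounds up):

```python
import math

num_examples = 50_000   # hypothetical training-set size
batch_size = 64         # hypothetical mini-batch size

# Iterations per epoch == number of mini-batches.
iterations_per_epoch = math.ceil(num_examples / batch_size)
print(iterations_per_epoch)   # 782 (the 782nd batch has only 16 examples)
```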

In batch gradient descent, as I mentioned, there is only 1 batch per epoch and thus 1 iteration per epoch, so the iteration axis and the epoch axis coincide; it's still just cost vs. epoch, with no multiplication involved.

Usually when we plot the training loss, we plot it at the epoch level rather than at the iteration level. With mini-batches, the loss fluctuates a lot from one mini-batch (iteration) to the next, so the per-iteration curve is very noisy.
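One common way to get the smoother epoch-level curve is to average the mini-batch losses within each epoch. A small sketch, with a made-up `losses` list standing in for the recorded per-iteration loss values:

```python
# Hypothetical per-iteration (per-mini-batch) losses for 2 epochs,
# 5 mini-batches each -- note how noisy they are within an epoch.
losses = [2.3, 1.9, 2.5, 1.7, 2.1,   # epoch 1
          1.4, 1.8, 1.2, 1.6, 1.0]   # epoch 2

batches_per_epoch = 5
epoch_losses = [
    sum(losses[i:i + batches_per_epoch]) / batches_per_epoch
    for i in range(0, len(losses), batches_per_epoch)
]
print(epoch_losses)   # [2.1, 1.4] -- the smoother curve you'd plot
```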

Thanks a lot, Somesh. It was very helpful. Have a good day.