Difference between epoch and iteration

I can’t clearly understand the difference between an epoch and an iteration.

You don’t give a reference to which course and lecture you are talking about, but one place that issue comes up is in DLS C2 W2, where Prof Ng introduces us to “minibatch” gradient descent. The idea is that you can get convergence more quickly if you subdivide your entire training set (the “batch”) into smaller mini-batches, typically of size 16, 32, or 64. Once you subdivide it like that, you run two levels of loop: one pass of the inner loop (one “iteration”) processes a single minibatch, and the outer loop covers all the minibatches (one “epoch”). So an “epoch” means a complete pass through the entire training set, whereas an “iteration” is one gradient-descent step within the epoch on one minibatch.
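To make the two loop levels concrete, here is a minimal Python/NumPy sketch of the structure. The function name, shapes, and hyperparameter values are just illustrative assumptions, not the exact code from the assignment:

```python
import numpy as np

def train(X, Y, batch_size=64, num_epochs=10):
    """Illustrative loop structure only: outer loop = epochs, inner loop = iterations.
    Assumes X has shape (n_features, m_examples) and Y has shape (1, m_examples)."""
    m = X.shape[1]                                  # total number of training examples
    for epoch in range(num_epochs):                 # one epoch = one full pass over the data
        num_iterations = 0
        for start in range(0, m, batch_size):       # one iteration per minibatch
            X_batch = X[:, start:start + batch_size]
            Y_batch = Y[:, start:start + batch_size]
            # forward prop, cost, backward prop, and the parameter update
            # for this minibatch would go here (one gradient-descent step)
            num_iterations += 1
        print(f"epoch {epoch}: {num_iterations} iterations")
```

With m = 6400 examples and batch_size = 64, each epoch would consist of 100 iterations.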

@paulinpaloalto
Thank you, my friend. Yes, exactly, I’m talking about DLS.

It’s great that my answer was relevant. The next level of subtlety is that the usual practice is to randomly shuffle the training set and recreate the minibatches at the start of each epoch. The purpose is to smooth out the statistical behavior as much as possible.
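Here is a rough sketch of what that per-epoch shuffle could look like in NumPy. The function name and shapes are assumptions for illustration, not the exact helper from the course assignment:

```python
import numpy as np

def make_shuffled_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the training set, then split it into mini-batches.
    Meant to be called again at the top of every epoch so the batches differ each time."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    permutation = rng.permutation(m)        # a new random ordering of the examples
    X_shuffled = X[:, permutation]
    Y_shuffled = Y[:, permutation]
    mini_batches = []
    for start in range(0, m, batch_size):
        mini_batches.append((X_shuffled[:, start:start + batch_size],
                             Y_shuffled[:, start:start + batch_size]))
    return mini_batches
```

In the training loop you would call this once per epoch (with a different seed each epoch), so every epoch sees the examples grouped into different minibatches.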