An “epoch” is one complete processing cycle over the whole training dataset.
ChatGPT gives this more scholarly definition:
In neural networks, an epoch is one complete pass through the entire training dataset. During an epoch, the model processes each training example once (or in mini-batches) and updates its weights based on the loss function and optimization algorithm.
@ai_is_cool I think the answer is given by @dtonhofer.
Now, as you quoted: “Andrew Ng did not define it in the previous videos, but for current and future students taking the Machine Learning Specialization, it’s essential to have a solid grasp of the basics. This ensures a smoother learning experience and better understanding of advanced concepts.”
It is not possible to redefine every piece of terminology in each course. Hope you understand.
Also, I totally agree with what @TMosh has mentioned.
I would suggest completing the Deep Learning Specialisation, as data/dataset-related terminology is covered extensively in DLS.
For this same reason I recommend taking DLS first and then MLS. That way you can follow the course videos while already knowing what a batch or an epoch is; if I remember correctly, this was discussed in one of your previous topics.
In simple terms: if you have a dataset of 10000 samples and the batch size is set to 100, the dataset is divided into 100 batches of 100 samples each. If the number of epochs is set to 10, then each epoch is one complete run through all 100 batches, i.e. one complete pass over all 10000 training samples.
Hope this helps. The above example is adapted from Prof. Ng’s course material, but from the Deep Learning Specialisation.
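The batch/epoch arithmetic above can be sketched in a few lines of plain Python (a hypothetical illustration, not actual course code; the variable names are my own):

```python
# Illustrative sketch of the epoch/batch arithmetic discussed above.
num_samples = 10_000   # total training examples in the dataset
batch_size = 100       # samples processed per weight update
num_epochs = 10        # full passes through the dataset

batches_per_epoch = num_samples // batch_size   # 10000 / 100 = 100 batches

for epoch in range(num_epochs):
    for batch in range(batches_per_epoch):
        # one gradient update on a 100-sample mini-batch would happen here
        pass
    # at this point every one of the 10000 samples has been seen once

total_updates = num_epochs * batches_per_epoch  # 10 * 100 = 1000 updates
print(batches_per_epoch, total_updates)  # → 100 1000
```

So one epoch means 100 weight updates here, and the full training run performs 1000 updates in total.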