Week 1 : Optional Lab : Simple Neural Network : Omission of term

In the Optional Lab : Simple Neural Network, I read the following:

“Tile/copy our data to increase the training set size and reduce the number of training epochs.”

The term “…training epochs” is introduced here in the lab, but Andrew did not define it in the previous videos, so I have no idea what it means.

Perhaps the developer of the lab could include an explanation of this term so that current and future students understand what it means.

An “epoch” is one complete processing pass over the whole training set.

ChatGPT gives this more scholarly definition:

In neural networks, an epoch is one complete pass through the entire training dataset. During an epoch, the model processes each training example once (or in mini-batches) and updates its weights based on the loss function and optimization algorithm.
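To make that definition concrete, here is a minimal sketch of such a training loop. This is my own illustration, not the lab's code: it assumes NumPy arrays, a linear model, and a squared-error loss, and `train` is a hypothetical helper name.

```python
import numpy as np

def train(X, y, weights, epochs=10, batch_size=32, lr=0.01):
    """One epoch = one complete pass through the whole training set."""
    n = X.shape[0]
    for epoch in range(epochs):
        order = np.random.permutation(n)           # reshuffle examples each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]  # one mini-batch of examples
            pred = X[idx] @ weights                # linear-model prediction
            grad = X[idx].T @ (pred - y[idx]) / len(idx)  # squared-error gradient
            weights -= lr * grad                   # one weight update per mini-batch
    return weights
```

The outer loop counts epochs; the inner loop steps through the data one mini-batch at a time, so the model's weights are updated many times within a single epoch.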

What is a “batch”? Again, this is a term not defined in the previous video lessons.

Please use only terminology that has been defined in the video lessons of this particular course up to this optional lab.

That’s really going to limit the number of people who are willing to reply to your questions.

@ai_is_cool I think the answer is given by @dtonhofer.
Now, as you quoted, “Andrew Ng did not define it in the previous videos” :laughing:, but for current and future students taking the Machine Learning Specialization, it’s essential to have a solid grasp of the basics. This ensures a smoother learning experience and a better understanding of advanced concepts.
It is not possible to define every term again and again in each course. :blush: Hope you understand.
Also, I totally agree with what @TMosh has mentioned.

One person is sufficient, provided they have completed the course successfully or are at the same stage (Week 1) as I am.

Hi @ai_is_cool,

I would suggest completing the Deep Learning Specialization, as data/dataset-related terminology is covered extensively in DLS.

For this very same reason I recommend DLS first and then MLS: that way you can stick to your course-video criterion and still know what a batch or an epoch is. As far as I remember, this came up in one of your previous topic discussions.

In simple terms, if you have a dataset of 10,000 samples and the batch size is set to 100, the dataset is divided into 100 batches of 100 samples each. If the number of epochs is set to 10, then each epoch is one complete run through all 100 batches, i.e. one complete pass over all 10,000 samples, and the whole dataset is therefore processed 10 times.

Hope this helps. The example above comes from material provided by Prof. Ng, but in the Deep Learning Specialization.
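To tie those numbers to code: here is a minimal sketch (my own, not from either course) using Keras, which the MLS labs also use. The data is random and the layer sizes are arbitrary placeholders; only the `epochs` and `batch_size` arguments matter for the point being made.

```python
import numpy as np
import tensorflow as tf

# Hypothetical data: 10,000 samples with 5 features each
# (sizes chosen only to match the numbers in the example above).
X = np.random.rand(10000, 5).astype(np.float32)
y = np.random.rand(10000, 1).astype(np.float32)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu"),  # arbitrary placeholder layer
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# batch_size=100 -> 10000 / 100 = 100 weight updates (batches) per epoch;
# epochs=10     -> 10 complete passes over all 10,000 samples.
model.fit(X, y, epochs=10, batch_size=100, verbose=0)
```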