A simple example: let's say I have 1000 samples and each mini-batch is 100, so I get t = 10 mini-batches here.

The picture above is from the function `model`, which is given directly by the HW from "Optimization_methods". Please advise on my questions below:

Q1. If num_epochs = 2, the WHOLE 1000-sample training set will be passed 2 times, because the inner loop `for minibatch in minibatches:` passes the WHOLE 1000 samples one time. Yes or no?

Q2. If Q1 is yes, then 1 epoch passes through the whole sample set (1000 here), NOT one mini-batch (100 here)? Yes or no?

In the professor's slides, my understanding is that 1 epoch passes one mini-batch (100 here). I am kind of confused by the definition of an epoch. Thank you.
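To make my understanding concrete, here is a minimal sketch of the loop structure I'm asking about. This is NOT the actual HW code: `random_mini_batches` and `count_samples_seen` are my own simplified stand-ins (shuffling and the gradient update are omitted), just to show that one epoch = one full pass over all mini-batches.

```python
def random_mini_batches(X, mini_batch_size):
    """Split the m samples in X into consecutive mini-batches
    (shuffling omitted for clarity)."""
    m = len(X)
    return [X[k:k + mini_batch_size] for k in range(0, m, mini_batch_size)]

def count_samples_seen(X, num_epochs, mini_batch_size):
    """Count how many samples the training loop touches in total."""
    samples_seen = 0
    for epoch in range(num_epochs):        # outer loop: one iteration per epoch
        minibatches = random_mini_batches(X, mini_batch_size)
        for minibatch in minibatches:      # inner loop: covers ALL samples once
            samples_seen += len(minibatch)  # (a gradient step would happen here)
    return samples_seen

X = list(range(1000))                       # 1000 samples, mini-batch size 100
print(len(random_mini_batches(X, 100)))     # 10 mini-batches per epoch
print(count_samples_seen(X, 2, 100))        # 2 epochs -> 2 * 1000 = 2000 samples seen
```

If this sketch matches the HW's `model`, then the answer to Q1 would be yes: with num_epochs = 2, the inner loop runs over all 10 mini-batches twice, touching all 1000 samples each time.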