In this video, at 7:49, Andrew said: “maybe one iteration does this”. Didn’t he mean that one epoch might do that? Isn’t mini-batch gradient descent taking m / batch_size
steps for each iteration?
Hi, @paul2048.
Didn’t he mean that one epoch might do that?
That would be the case with batch gradient descent. With mini-batch gradient descent, you take m / batch_size
gradient descent steps (i.e., iterations) per epoch. One epoch corresponds to one pass through the entire training set.
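Maybe a small Python sketch makes the terminology concrete (the numbers, array, and variable names here are just made up for illustration, not from the course assignments):

```python
import numpy as np

m = 1000          # number of training examples (hypothetical)
batch_size = 100  # mini-batch size (hypothetical)

X = np.random.randn(m, 5)   # dummy training set with 5 features
num_epochs = 3

iterations_per_epoch = m // batch_size   # 1000 / 100 = 10 gradient steps per epoch

for epoch in range(num_epochs):              # one epoch = one full pass over the training set
    for t in range(iterations_per_epoch):    # one iteration = one gradient step on one mini-batch
        batch = X[t * batch_size:(t + 1) * batch_size]
        # ... compute gradients on `batch` and update the parameters here ...
        pass

# With batch gradient descent, batch_size == m, so iterations_per_epoch == 1:
# a single iteration per epoch, which is why the two terms line up only in that case.
```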
This video probably explains it better.
Let me know if that was helpful.
Hi, @nramon,
It makes sense now. I was confused about the terms epoch and iteration.
But does that mean that in batch gradient descent you could (hypothetically) use those terms interchangeably, since you would have just one iteration per epoch?
Rather than call them interchangeable, since they refer to slightly different concepts, I would say it takes one iteration to complete one epoch.
Good luck with the rest of the course.
Thank you very much, Ramón!