[Mini-batch gradient descent] Did Andrew mean "epoch" instead of "iteration"?

In this video, at 7:49, Andrew said: “maybe one iteration does this”. Didn’t he mean that one epoch might do that? Doesn’t mini-batch gradient descent take m / batch_size steps for each iteration?

Hi, @paul2048.

Didn’t he mean that one epoch might do that?

That would be the case with batch gradient descent. With mini-batch gradient descent, you take m / batch_size gradient descent steps (i.e., iterations) per epoch. One epoch corresponds to one pass through the entire training set.
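For concreteness, here is a minimal sketch of that loop structure in Python/NumPy. The function and variable names (mini_batch_gradient_descent, grad_fn, params) are my own for illustration, not the course's notation; the point is only that the inner loop runs roughly m / batch_size times (one iteration per mini-batch) for every pass of the outer loop (one epoch).

```python
import numpy as np

def mini_batch_gradient_descent(X, Y, params, grad_fn, learning_rate=0.01,
                                batch_size=64, num_epochs=10):
    """One epoch = one full pass over the training set.
    One iteration = one gradient step on a single mini-batch,
    so there are ceil(m / batch_size) iterations per epoch."""
    m = X.shape[1]  # number of training examples (stored as columns)
    for epoch in range(num_epochs):
        # Reshuffle once per epoch so the mini-batches differ between epochs
        permutation = np.random.permutation(m)
        X_shuffled, Y_shuffled = X[:, permutation], Y[:, permutation]

        # Each pass through this inner loop is one iteration (one gradient step)
        for start in range(0, m, batch_size):
            X_batch = X_shuffled[:, start:start + batch_size]
            Y_batch = Y_shuffled[:, start:start + batch_size]
            grads = grad_fn(X_batch, Y_batch, params)
            for key in params:
                params[key] -= learning_rate * grads[key]
    return params
```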

This video probably explains it better :slight_smile:

Let me know if that was helpful.


Hi, @nramon,

It makes sense now. I was confused about the terms “epoch” and “iteration”.
But does that mean that in batch gradient descent you could (hypothetically) use those terms interchangeably, since each epoch consists of just one iteration?


Rather than calling them interchangeable, since they refer to slightly different concepts, I would say that, in batch gradient descent, it takes one iteration to complete one epoch.
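As a quick sanity check on the arithmetic (the numbers below are just illustrative, not from the lecture):

```python
import math

m = 5000           # training examples (illustrative value)
batch_size = 64    # mini-batch size (illustrative value)

# Mini-batch gradient descent: ceil(5000 / 64) = 79 iterations per epoch
print(math.ceil(m / batch_size))

# Batch gradient descent is the special case batch_size = m,
# so each epoch consists of exactly one iteration
print(math.ceil(m / m))
```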

Good luck with the rest of the course :slight_smile:


Thank you very much, Ramón! :slight_smile:
