Mini-batch Gradient Descent

Hi,

Does mini-batching help with anything other than computation speed?

The benefit of mini-batch gradient descent is that the parameter updates happen much more frequently. That tends to speed up convergence, meaning you get to a useful solution in fewer total epochs, and thus with a lower total compute cost and less wall-clock time. So I believe the answer to your question is "no": the only benefit is lower total compute cost. But that is not a trivial thing. Training large models on huge datasets is very expensive and time-consuming, so anything you can do to mitigate that is a big deal.
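As a rough illustration of how much more often the parameters get updated, here is a small calculation. The numbers are made up for the example, not from the course:

```python
# Hypothetical numbers, only to illustrate update frequency per epoch.
m = 1_000_000          # number of training examples (made up)
mini_batch_size = 64   # a common default

full_batch_updates = 1                         # batch GD: one update per pass over the data
mini_batch_updates = -(-m // mini_batch_size)  # ceil(m / mini_batch_size) updates per epoch

print(full_batch_updates)   # 1
print(mini_batch_updates)   # 15625
```

So in one pass over the same data, mini-batch gradient descent takes thousands of gradient steps instead of one.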

Hi Mr. paulinpaloalto,
Thank you.
How do I use mini-batch gradient descent in Python?

There is an assignment that walks through this logic: the Optimization assignment in DLS Course 2 Week 2 (C2 W2).
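If it helps in the meantime, here is a minimal sketch of the idea in plain NumPy. This is not the assignment code: the linear model, the mean-squared-error gradients, and the helper name `random_mini_batches` are just assumptions chosen to show the shuffle / partition / update loop.

```python
import numpy as np

def random_mini_batches(X, Y, mini_batch_size=64, seed=0):
    """Shuffle the (X, Y) columns together and split them into mini-batches."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]                      # examples are columns, as in the DLS convention
    permutation = rng.permutation(m)
    X_shuffled, Y_shuffled = X[:, permutation], Y[:, permutation]
    return [
        (X_shuffled[:, k:k + mini_batch_size], Y_shuffled[:, k:k + mini_batch_size])
        for k in range(0, m, mini_batch_size)   # last batch may be smaller than 64
    ]

# Toy data: 1 feature, 1000 examples, true relationship y = 3x + 1 plus noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(1, 1000))
Y = 3 * X + 1 + 0.1 * rng.normal(size=(1, 1000))

w, b = 0.0, 0.0
learning_rate = 0.1

for epoch in range(20):
    for X_mb, Y_mb in random_mini_batches(X, Y, mini_batch_size=64, seed=epoch):
        m_mb = X_mb.shape[1]
        Y_hat = w * X_mb + b                        # forward pass on this mini-batch
        dw = np.sum((Y_hat - Y_mb) * X_mb) / m_mb   # gradient of MSE cost w.r.t. w
        db = np.sum(Y_hat - Y_mb) / m_mb            # gradient of MSE cost w.r.t. b
        w -= learning_rate * dw                     # one parameter update per mini-batch
        b -= learning_rate * db

print(f"learned w = {w:.3f}, b = {b:.3f}")   # should end up close to 3 and 1
```

The assignment follows the same shuffle-and-partition pattern, with the course's own forward and backward propagation in place of the toy gradients here.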