Is there any standard size to select for batch_size? Of course, the batch size will depend on the total number of images in the train and validation sets. So, my two questions are:
- Is there any ideal/standard ratio of batch_size to the total number of images?
- Other than the change in training speed, what other performance changes, if any, happen when we change the batch_size?
Hello @Rishabh_Singh2
To my understanding, the answer is no.
The batch size determines how many training examples are processed in parallel during training/inference. The batch size at training time can affect how fast and how well your training converges. For train_batch_size, it's worth picking a value that is neither too small nor too large. For some applications, using the largest possible training batches can actually be desirable, but in general you select it through experiments and validation.
However, for validation_batch_size and test_batch_size, you should pick the largest batch size that your hardware can handle without running out of memory and crashing. Finding this is usually a simple trial-and-error process. The larger your batch size at inference time, the faster inference will be, since more inputs can be processed in parallel.
The batch size affects both the computational speed and the convergence speed of the algorithm.
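One concrete way batch size affects convergence is that it changes the number of gradient updates per epoch. A quick sketch (the dataset size here is made up for illustration):

```python
import math

# A larger batch size means fewer gradient updates per epoch,
# which is one reason convergence behaviour changes with batch size.
num_train_images = 10_000  # hypothetical dataset size

for batch_size in (16, 64, 256):
    steps_per_epoch = math.ceil(num_train_images / batch_size)
    print(f"batch_size={batch_size}: {steps_per_epoch} updates per epoch")
```

So going from batch size 16 to 256 cuts the updates per epoch from 625 to 40; whether fewer, lower-variance updates help or hurt depends on the problem, which is why experimentation is recommended.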