Why do we need to specify the batch size for validation generator

In C1_W4_Lab_3_compacted_images.ipynb:

train_generator = train_datagen.flow_from_directory(
        './horse-or-human/',  # This is the source directory for training images
        target_size=(150, 150),  # All images will be resized to 150x150
        batch_size=128,
        # Since we use binary_crossentropy loss, we need binary labels
        class_mode='binary')

validation_generator = validation_datagen.flow_from_directory(
        './validation-horse-or-human/',  # This is the source directory for validation images
        target_size=(150, 150),  # All images will be resized to 150x150
        batch_size=32,
        # Since we use binary_crossentropy loss, we need binary labels
        class_mode='binary')

Why do we need to specify the batch size for validation generator? Batching is only used in training, right?

Hi Lei Guo,

You are correct that batching mainly matters for training, where batch size affects the gradient updates. However, you can also specify a batch size for the validation step, because sometimes it is not possible to load the entire validation set into memory in one go. By validating in small batches, you can still evaluate on the full set while only holding one batch in memory at a time. The validation batch size does not change the resulting metrics, only the memory footprint and speed of evaluation.
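Here is a minimal plain-Python sketch of the idea (a stand-in for what `flow_from_directory` plus `model.evaluate` do internally; the loss values are made up for illustration): the metric is accumulated batch by batch, so only one batch ever needs to fit in memory, and the batch size does not change the result.

```python
def batched_mean_loss(losses, batch_size):
    """Average per-sample losses while only holding one batch in memory."""
    total, count = 0.0, 0
    for start in range(0, len(losses), batch_size):
        batch = losses[start:start + batch_size]  # one batch at a time
        total += sum(batch)
        count += len(batch)
    return total / count

# Illustrative per-sample validation losses
per_sample_losses = [0.2, 0.4, 0.1, 0.3, 0.5, 0.25, 0.35]

# Same metric whatever batch size we choose; the batch size only
# controls peak memory use, not the value of the metric.
small_batches = batched_mean_loss(per_sample_losses, 2)
one_big_batch = batched_mean_loss(per_sample_losses, 32)
```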

Hope this helps.
