Different batch sizes during training and forecasting

C4W3_Assignment: there are two relevant parts in it.

Compiling the model

In this part we train the model with a batch size of 34 (and epochs=50).

Faster model forecasts

In the model_forecast function we have this: ds.batch(32).prefetch(1).
So the batch size used there is 32.

Question

34 and 32 are different batch sizes, used during training and forecasting respectively.
Shouldn't they be the same? How is this possible?

Batch size matters only during training, because it determines how many samples contribute to each gradient estimate in the backward pass. At inference time there is no backward pass, so you can use any batch size.
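To see why the forward pass is indifferent to batch size, here is a minimal NumPy sketch (a stand-in for a trained model, not the assignment's actual network): with fixed weights, splitting the data into batches of any size produces the same per-sample predictions as a single full-batch pass. The names `predict`, `W`, and `b` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed weights of a tiny dense layer (stand-in for an already-trained model).
W = rng.normal(size=(3, 1))
b = rng.normal(size=(1,))

def predict(x):
    # Forward pass only: no gradients, so the batch dimension is free.
    return x @ W + b

data = rng.normal(size=(10, 3))

# Predict in batches of 4 (the last batch is smaller) ...
batched = np.concatenate([predict(data[i:i + 4]) for i in range(0, 10, 4)])
# ... and in one full batch of 10.
full = predict(data)

assert np.allclose(batched, full)  # identical per-sample outputs
```

The batch size at inference is purely a throughput/memory trade-off, which is why `model_forecast` can pick 32 regardless of what training used.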

Consider enrolling in the Deep Learning Specialization to better understand the mathematics behind the backward pass of a neural network.