I am trying to understand what BATCH_SIZE and BUFFER_SIZE are, as introduced in Lab 3 in the "Training the model" section:
BUFFER_SIZE = 10000
BATCH_SIZE = 64
# Get the train and test splits
train_data, test_data = imdb_subwords['train'], imdb_subwords['test']
# Shuffle the training data
train_dataset = train_data.shuffle(BUFFER_SIZE)
# Batch and pad the datasets to the maximum length of the sequences
train_dataset = train_dataset.padded_batch(BATCH_SIZE)
test_dataset = test_data.padded_batch(BATCH_SIZE)
print(train_dataset.take(1))
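For context, `imdb_subwords` is loaded earlier in the lab from `tensorflow_datasets`; I believe it looks roughly like this (the exact dataset name here is my assumption):

import tensorflow_datasets as tfds

# Load the subword-encoded IMDB reviews dataset (assuming the
# "imdb_reviews/subwords8k" config used in the course labs)
imdb_subwords, info = tfds.load("imdb_reviews/subwords8k",
                                with_info=True, as_supervised=True)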
In addition, when I increase the number of epochs from 10 to 15, the loss gets worse as training goes on. Why is that?
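For reference, this is roughly how I am compiling and training the model. This is a minimal sketch from memory; the layer sizes and the use of `info.features['text'].encoder` for the vocabulary size are my assumptions and may differ slightly from the lab:

import tensorflow as tf

# Minimal sketch of the model, assuming `info` from tfds.load above
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(info.features['text'].encoder.vocab_size, 64),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

# The loss starts getting worse when I raise epochs from 10 to 15
history = model.fit(train_dataset, epochs=15, validation_data=test_dataset)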