DLS4 Week 1: Epochs and batches

Just a question regarding the training of the ConvNets in Week 1, Assignment 2, Part 1.
In Part 1, epochs=10 and batch_size=16, and we have 600 training pictures in total. Does this mean that we run through all 600 pictures ten times, and that the weights are updated after every 16 training examples?
Giving 600 * 10 / 16 = 375 updates?

I tried to read up on batches and epochs on the internet, but it is a bit of a mess how people use the terms “epoch” and “batch”.

Thanks.

epoch

  1. This is the number of times you iterate over the entire training set.
  2. epochs=10 means you iterate over the training set 10 times before training ends.
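
To make the two terms concrete, here is a minimal sketch of how they nest, using a toy NumPy linear model in place of the assignment's ConvNet (only m=600, batch_size=16, and epochs=10 come from the assignment; everything else is illustrative):

```python
import numpy as np

m, batch_size, epochs = 600, 16, 10
X = np.random.rand(m, 3)                   # toy stand-in for the 600 training examples
y = np.random.rand(m)
w = np.zeros(3)

for epoch in range(epochs):                # epochs=10: ten complete passes over the data
    perm = np.random.permutation(m)        # reshuffle the examples each epoch
    for start in range(0, m, batch_size):  # one gradient update per mini-batch
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]            # the final slice holds only 600 % 16 = 8 examples
        grad = Xb.T @ (Xb @ w - yb) / len(Xb)  # gradient of a half-MSE loss
        w -= 0.1 * grad                    # weight update: one per batch
```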

batch_size

  1. The number of training examples the model uses for one gradient update.
  2. If fewer than batch_size examples remain at the end of an epoch, the leftover examples still form a (smaller) final batch. That is why 38 batches are processed per epoch: 600 / 16 = 37.5, i.e. 37 full batches of 16 plus one final batch of 8. Since each batch triggers one update, 10 epochs give 38 * 10 = 380 updates in total, slightly more than the 375 from 600 * 10 / 16 — see the count check below.
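
The bookkeeping in a couple of lines, assuming Keras-style handling of the final partial batch (which is what the assignment's fit call does):

```python
import math

m, batch_size, epochs = 600, 16, 10
batches_per_epoch = math.ceil(m / batch_size)  # ceil(37.5) = 38: 37 full batches + one of 8
total_updates = batches_per_epoch * epochs     # 38 * 10 = 380 gradient updates overall
print(batches_per_epoch, total_updates)        # -> 38 380
```

You can see the same count in the training log: with 600 examples and batch_size=16, the Keras progress bar reads 38/38 steps per epoch.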