Help, I have a question about adding more pictures to your training examples

I have a question about adding more pictures to your training examples.
More examples mean more iterations.
But Prof Ng has said that training your model for more iterations can lead to overfitting.
In that case, wouldn't more examples lead to overfitting?

You need to be precise about the use of iteration versus epoch. An epoch is one pass (forward and backward) of the complete data set through the algorithm. Depending on the data set size, there can be multiple subsets (mini-batches) of the data passed through the algorithm per epoch. If your data set is 1000 samples and the batch size is 500, it will take 2 iterations to complete an epoch. Increasing the number of epochs can result in overfitting. Increasing the number of iterations per epoch, for example by reducing the batch size or by increasing the training set size, does not.
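To make the arithmetic concrete, here is a minimal sketch of the epoch/iteration relationship described above (the function name and numbers other than the 1000/500 example are illustrative, not from any particular framework):

```python
import math

def iterations_per_epoch(num_samples: int, batch_size: int) -> int:
    """Mini-batch iterations needed for one full pass (epoch) over the data."""
    return math.ceil(num_samples / batch_size)

# The example from the reply above: 1000 samples, batch size 500.
print(iterations_per_epoch(1000, 500))  # 2

# Doubling the training set doubles the iterations per epoch,
# but each sample is still seen only once per epoch, so this
# does not cause overfitting the way adding epochs can.
print(iterations_per_epoch(2000, 500))  # 4
```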

Adding more training samples does not increase overfitting: in fact it is one of the specific strategies that you can use to combat overfitting. Prof Ng discusses this in some detail in the lectures about how to tune models and deal with high bias and/or high variance.

It’s just that adding data is not always a cheap or easy option. Of course everything is a tradeoff: more data means higher training and storage costs. So you have to make all these decisions based on the performance results in terms of the accuracy of your model. How important is better accuracy and what cost does that justify? If an incorrect prediction from your model has potentially life threatening consequences (e.g. it’s controlling a moving vehicle of some sort or predicting whether a medical image contains a tumor), then it may well be worth quite a bit of added cost to get more accuracy.

Thank you for your guidance. I finally understand the difference between an iteration and an epoch.


Thank you. When Prof Ng talked about "early stopping", I thought more samples meant more iterations, which would cause the weights (W) to grow.
Now I know that more samples enhance the model's ability to generalize. Just as you said, "Adding more training samples does not increase overfitting."
