Why train on 2 epochs

I might be missing the point of the Week 2 assignment in Course 2 (the convolutional network one). We are asked to perform data augmentation (I understand that part), but then only train for 2 epochs? What good can come from 2 epochs? All the previous assignments so far have “taught” us to incorporate a callback to stop training when some metric is reached, but here there seems to be no objective, only the acceptance that “the grader might run out of memory” unless we use 2 epochs for training (or a batch_size of 10).

Can someone please help me: is the epoch count due to a grading limitation, or is it a pedagogical choice?
Thanks!
Eyas


The grader can run out of time if you train too long. You’re right about the callbacks.Callback approach.

I’d say it is more of a financial decision. Computation costs money. If the point of the exercise is just to show that something runs, then 2 epochs can demonstrate that as well as 1,000 can, without incurring unnecessary costs or tying up constrained platform resources. You’re correct both that 2 epochs is unlikely to be chosen for a real solution and that accuracy trends can be measured in callbacks to dynamically alter training duration.
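For reference, this is roughly the callback pattern the earlier assignments use. A minimal sketch, assuming TensorFlow/Keras; the class name `StopAtAccuracy` and the 0.95 target are illustrative, not taken from the assignment:

```python
import tensorflow as tf

# Hypothetical threshold; the assignment itself does not specify one.
TARGET_ACCURACY = 0.95

class StopAtAccuracy(tf.keras.callbacks.Callback):
    """Stop training once the training accuracy crosses a threshold."""

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # 'accuracy' is the default metric name Keras logs when you compile
        # the model with metrics=['accuracy'].
        if logs.get('accuracy', 0.0) >= TARGET_ACCURACY:
            print(f"\nReached {TARGET_ACCURACY:.0%} accuracy, stopping training.")
            self.model.stop_training = True

# Usage: pass the callback to model.fit; `epochs` then acts only as an upper bound.
# model.fit(train_generator, epochs=50, callbacks=[StopAtAccuracy()])
```

With a callback like this, the epoch count stops being the real stopping criterion; the metric is. The graded notebooks just cap epochs at 2 so the run fits in the grader’s time and memory budget.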

Thanks. I finished the Deep Learning Specialization last week. It was amazing; the content was out of this world. I was under the impression that the TensorFlow Developer Professional certificate would be quite advanced as well, hence my question. My previous experience with Andrew Ng’s courses, where you had to go as deep as writing your own functions (think backprop and LSTM cells), led me to expect that this one would be more meaningful and guided.

So @Eyas_Taifour, this course is about mastering the techniques of TensorFlow.
It’s necessary to know the inner workings of the packages you use, but you can’t keep implementing everything yourself every time; that’s why TensorFlow and PyTorch were made.
Comparing this course to the DL Specialization is like comparing apples to bananas, and I’d rather not do that :slight_smile: