Week 1 Hyperparameter Tuning Questions

A question I had: when you reach the hyperparameter-tuning stage, how do you decide on a reasonable number of epochs per trial, especially for larger networks? We used a simple MNIST example to understand the concepts, but I am still wondering how these steps would be performed in the real world. Say, for example, that in your testing you have seen learning plateau at around 200 epochs, and that a training run of that length takes a week or so. In that case, how would one choose the hyperparameter search space, and in particular the number of epochs to train for in each new trial?
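One piece of the answer is that a tuning trial does not need to run for the full 200 epochs: an early-stopping rule watches the validation loss and ends a trial once it stops improving. The sketch below illustrates that logic in plain Python (the function name and the synthetic loss curve are made up for illustration):

```python
def train_with_early_stopping(losses, patience=5, min_delta=1e-4):
    """Return the epoch at which training would stop, given a sequence of
    per-epoch validation losses. Stops after `patience` consecutive epochs
    with no improvement greater than `min_delta`."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(losses, start=1):
        if loss < best - min_delta:
            best = loss   # new best loss: reset the patience counter
            wait = 0
        else:
            wait += 1     # no meaningful improvement this epoch
            if wait >= patience:
                return epoch  # stopped early
    return len(losses)        # ran to completion

# A synthetic curve that improves quickly, then plateaus after ~30 epochs.
curve = [1.0 / (1 + 0.5 * e) for e in range(30)] + [0.065] * 170

# Even with a 200-epoch budget, this trial stops at epoch 35.
print(train_with_early_stopping(curve, patience=5))  # → 35
```

Because most unpromising configurations plateau (or diverge) early, this cuts the cost of a trial from the worst-case budget down to however long that configuration actually keeps improving.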

Please read about the Hyperband tuner's max_epochs argument in the Keras Tuner documentation, and see how an EarlyStopping callback is passed to tuner.search.
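To make the connection concrete, here is a minimal sketch of that pattern, assuming Keras Tuner (`keras_tuner`) and TensorFlow are installed; the model-building function, hyperparameter choices, and directory names are illustrative placeholders, not a prescribed setup:

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    # Illustrative search space: layer width and learning rate.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32),
                              activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(hp.Choice("lr", [1e-2, 1e-3, 1e-4])),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# max_epochs is a ceiling per trial, not a fixed epoch count: Hyperband
# runs most trials for far fewer epochs and promotes only promising ones.
tuner = kt.Hyperband(
    build_model,
    objective="val_accuracy",
    max_epochs=30,
    factor=3,
    directory="tuning",
    project_name="mnist",
)

# EarlyStopping ends a trial as soon as validation loss plateaus, so even
# the longest trials rarely use the full max_epochs budget.
stop_early = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=5)

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
tuner.search(
    x_train, y_train,
    validation_split=0.2,
    callbacks=[stop_early],
)
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
```

The key point for the week-long-training scenario above: you set max_epochs as a generous ceiling, and the combination of Hyperband's successive-halving schedule and the EarlyStopping callback decides per trial how many epochs are actually spent, so you never have to hand-pick an epoch count for each new network.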