Hello,

While following the iterative ML strategy in the spirit of orthogonalization, we would quickly set up a model for training and first work on reducing avoidable bias. If the measured avoidable bias is acceptable, we would next look at the variance problem and reduce it using the related tactics. I understand this is an iterative process: each time we try out different tactics until one of the two problems is reduced.

In this scenario, where would we fit the hyperparameter search process that we saw in C2, W3, i.e. grid search or random search for identifying the best hyperparameters?
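To make the question concrete, by random search I mean something like the sketch below. The hyperparameter names and the `train_and_evaluate` function are just placeholders I made up for illustration; in practice `train_and_evaluate` would train a model and return its dev-set error.

```python
import math
import random

def train_and_evaluate(hparams):
    # Placeholder objective: a dummy score standing in for the dev-set
    # error a real training run would produce for these hyperparameters.
    lr = hparams["learning_rate"]
    return (math.log10(lr) + 3) ** 2 + 0.1 * hparams["num_layers"]

def random_search(num_trials, seed=0):
    rng = random.Random(seed)
    best_hparams, best_error = None, float("inf")
    for _ in range(num_trials):
        # Sample the learning rate on a log scale, as recommended in C2 W3.
        hparams = {
            "learning_rate": 10 ** rng.uniform(-4, -1),
            "num_layers": rng.randint(2, 5),
        }
        error = train_and_evaluate(hparams)
        if error < best_error:
            best_hparams, best_error = hparams, error
    return best_hparams, best_error

best, err = random_search(num_trials=25)
print(best, err)
```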

I am confused about the step at which we would perform the model selection process.

Once we get the final best model that performs well on both the training and dev sets, i.e. the model that is not suffering from a bias or variance problem, would we then do a coarse-to-fine search of hyperparameters (using random or grid search techniques) to find the best hyperparameters for this best model and obtain an even better one?
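By coarse-to-fine I mean something like the following sketch: do a coarse random search over a wide range, then sample more densely in a narrower window around the best coarse point. The `dev_error` objective and the window width are made-up placeholders, just to illustrate the idea.

```python
import random

def coarse_to_fine(evaluate, num_coarse=20, num_fine=20, seed=0):
    rng = random.Random(seed)
    # Coarse stage: sample log10(learning rate) uniformly over a wide range.
    coarse = [rng.uniform(-5, 0) for _ in range(num_coarse)]
    best = min(coarse, key=evaluate)
    # Fine stage: zoom into a narrower window around the coarse best
    # (the +/- 0.5 window width is an arbitrary choice for illustration).
    lo, hi = best - 0.5, best + 0.5
    fine = [rng.uniform(lo, hi) for _ in range(num_fine)]
    return min(fine + [best], key=evaluate)

# Dummy objective: pretend dev error is minimized at log10(lr) = -3.
dev_error = lambda log_lr: (log_lr + 3) ** 2
best_log_lr = coarse_to_fine(dev_error)
print(best_log_lr)
```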

Or would we do the model selection process while performing the iterative process of reducing the bias and variance problems itself?

Best Regards,

Bhavana