Changing NN architecture/hyperparameters and orthogonalization

In this week’s course we learned about orthogonalization: each change we make to our model should affect only one aspect of its performance.

Later we also learned that changing the model architecture/hyperparameters can improve BOTH avoidable bias and variance, which seems to violate the orthogonalization rule. How do we carry out the architecture/hyperparameter search then?


Hey Sara,
another good question :slight_smile:

The rule of thumb is to deal with high bias first, and only then work on high variance. I’ve described the decision-making process in detail here.
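In code, that "bias first, then variance" rule of thumb might look something like this (the function name and error numbers are just illustrative, not from the course):

```python
def next_step(train_err, dev_err, human_err):
    """Suggest what to tackle next, following the
    'reduce avoidable bias first, then variance' rule of thumb."""
    avoidable_bias = train_err - human_err   # gap to human-level performance
    variance = dev_err - train_err           # gap between train and dev sets
    if avoidable_bias > variance:
        # e.g. bigger network, train longer, better optimizer
        return "reduce bias"
    elif variance > 0.0:
        # e.g. more data, regularization, dropout
        return "reduce variance"
    return "done"

print(next_step(train_err=0.08, dev_err=0.10, human_err=0.01))
# -> "reduce bias" (avoidable bias 0.07 > variance 0.02)
```

Of course in practice the comparison isn’t this mechanical, but it captures the ordering of the decisions.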

I assume that switching the learning algorithm will have quite a dramatic effect, so you would start the tuning process over again.
