Hello! I’ve watched the second video in Course 3 Week 1 of the Deep Learning Specialization, and I’ve realized I can’t quite grasp what Orthogonalization means. So let me write down what I understand about it, and please read it and correct any mistakes I’ve made.
Summary:
We can take the old TV example, where each knob tunes one property of the image: the height, the width, the rotation, and so on. We don’t want one knob that controls all of those at once; instead, each knob should tune a different thing. So Orthogonalization in this example means that each knob affects only one thing. Is that correct?
Correct. Orthogonalization refers to each knob affecting only one aspect of the TV picture. Similarly, in machine learning, orthogonalization means having a separate set of knobs (options) for each aspect of the model’s performance, so that adjusting one doesn’t interfere with the others.
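Just to make the ML side of the analogy concrete, here is a tiny Python sketch (not from the course; the `suggest_knobs` function and the exact knob lists are my own illustration, roughly following the chain Andrew Ng describes in the lecture): each performance goal gets its own set of knobs, and you only turn the knobs for the goal that is currently failing.

```python
# Illustrative sketch only: the knob lists below are an assumption based on
# how I recall the lecture's chain of goals (fit training set -> fit dev set
# -> fit test set -> perform well in the real world). The function name
# `suggest_knobs` is hypothetical.

# Each goal has its own "orthogonal" knobs: turning one should (ideally)
# not disturb progress on the other goals.
KNOBS = {
    "fit training set": ["bigger network", "better optimizer (e.g. Adam)", "train longer"],
    "fit dev set": ["regularization", "bigger training set"],
    "fit test set": ["bigger dev set"],
    "perform well in the real world": ["change dev/test set", "change cost function or metric"],
}


def suggest_knobs(problem: str) -> list:
    """Return the knobs that mainly address one specific performance problem."""
    return KNOBS.get(problem, [])


if __name__ == "__main__":
    # Example: the model does well on the training set but poorly on the dev
    # set, so we only turn the "fit dev set" knobs and leave the rest alone.
    print(suggest_knobs("fit dev set"))
```

The design point is the same as the TV analogy: if one knob (say, early stopping) affects several goals at once, it is harder to reason about which problem you are actually fixing.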
Thanks for your answer!