Hi Sir,
From the lecture on mini-batch gradient descent, we had two doubts. Could you please help clarify them?
First doubt: at 8:05 in the video, the professor says that mini-batch gradient descent does not always exactly converge, or oscillates in a very small region. In this statement, does "does not always converge" mean the algorithm keeps oscillating around the minimum? We are not sure why the professor also says it does not oscillate.
Second doubt: if the algorithm is wandering around the minimum, can we use a small learning rate, or will gradually reducing the learning rate help it converge closer to the minimum? (A small sketch of what we mean by reducing the learning rate is given below.)
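To make the second doubt concrete, here is a minimal sketch of mini-batch gradient descent with a decaying learning rate. The 1/(1 + decay*epoch) schedule, the quadratic (least-squares) loss, and names like lr0, decay, and batch_size are only illustrative assumptions, not the professor's setup:

# Minimal sketch: mini-batch gradient descent with learning rate decay,
# fitting a linear model to synthetic data (all names/values are illustrative).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
true_w = np.array([2.0, -3.0])
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(2)
lr0, decay, batch_size = 0.1, 0.01, 32

for epoch in range(50):
    lr = lr0 / (1.0 + decay * epoch)   # learning rate shrinks every epoch
    idx = rng.permutation(len(X))      # reshuffle data each epoch
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        # gradient of the mean-squared error on this mini-batch
        grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
        w -= lr * grad                 # smaller steps later -> less wandering

print(w)   # should end up close to true_w

The question is whether this kind of shrinking step size is the right way to stop the wandering around the minimum, or whether simply starting with a very small fixed learning rate is enough.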
Thanks,
Thayanban