In RMSprop, w and b are shown as two separate sets of parameters, where w moves towards the optimum along the horizontal axis and b produces the vertical oscillations. I was wondering: how are we supposed to choose these two sets?
When we implement it, wouldn't we need a function that tells us which parameters belong to the w part and which to the b part, as shown in the video?
Alpha Rate Decay.
As the iterations increase, won't dw and db also decrease? That is, wouldn't each update be smaller than the previous one after a certain point (especially when we are nearer to the optimum)? This was mentioned in the ML course by Andrew Ng.
If so, do we even need to decay alpha? Or are we doing the decay because of the property of mini-batch gradient descent that it does not converge?
Apologies for the delayed response. Coming to your first query, I think there is a slight confusion as to what w and
b are. They simply represent the weights and biases for the current mini-batch, for which we compute the gradients. The example in which Prof Andrew shows weights along the horizontal axis and biases along the vertical axis is just an illustration he provided for intuition, nothing more.
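To make this concrete, here is a minimal sketch of a single RMSprop step. The function name and the sample values are my own for illustration; only the update rule follows the course. The point is that w and b are just the ordinary parameters of the layer, each with its own running average of squared gradients, so there is no function that assigns parameters to a "horizontal" or "vertical" set:

```python
import numpy as np

def rmsprop_update(W, b, dW, db, sW, sb, alpha=0.01, beta=0.999, eps=1e-8):
    """One RMSprop step for one layer's weights W and biases b."""
    # Exponentially weighted average of the squared gradients.
    sW = beta * sW + (1 - beta) * dW ** 2
    sb = beta * sb + (1 - beta) * db ** 2
    # Each gradient is divided by the root of its own running average,
    # so parameters with large oscillating gradients are damped
    # automatically -- we never hand-pick which parameters oscillate.
    W = W - alpha * dW / (np.sqrt(sW) + eps)
    b = b - alpha * db / (np.sqrt(sb) + eps)
    return W, b, sW, sb

# Illustrative call with made-up shapes and gradients:
W, b = np.ones((2, 2)), np.zeros(2)
sW, sb = np.zeros((2, 2)), np.zeros(2)
dW, db = np.full((2, 2), 0.5), np.full(2, 0.5)
W, b, sW, sb = rmsprop_update(W, b, dW, db, sW, sb)
```

In the lecture picture, w just happens to be the slowly-varying direction and b the oscillating one; RMSprop applies the same rule to both and the damping falls out of the per-parameter averages.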
As for your second query, the statement that "as the iterations increase, dw and db will very likely decrease" is true. But when we are near the optimum, it is best to take small steps, and that is what Learning Rate Decay might help us ensure. Also, as you mentioned, with Mini-Batch Gradient Descent each update is influenced by the optimum of the current mini-batch, which may not be aligned with the optimum over the full training set; in that case, Learning Rate Decay helps ensure that we don't drift away from the desired optimum.
Note the use of "might" here, because that is what Prof Andrew uses as well: "One of the things that might help speed up your learning algorithm is to slowly reduce your learning rate over time." It is just one of the strategies that may help in some cases and not in others; you can only try and find out. I hope this helps.
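For reference, one common decay schedule from the course is alpha = alpha0 / (1 + decay_rate * epoch_num). A tiny sketch (function name is my own):

```python
def decayed_alpha(alpha0, decay_rate, epoch_num):
    """Learning rate decay: alpha shrinks as the epoch number grows."""
    return alpha0 / (1 + decay_rate * epoch_num)

# With alpha0 = 0.2 and decay_rate = 1.0:
# epoch 0 -> 0.2, epoch 1 -> 0.1, epoch 2 -> ~0.0667, ...
```

So even if dw and db shrink naturally near the optimum, the schedule gives you explicit control over the step size, which is what tames the mini-batch wandering described above.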