Learning Rate Decay vs Weight Decay

In the Week 2 programming assignment “Optimization_methods”, there are some exercises near the end on learning rate decay. However, some of the explanations/descriptions mention weight decay; see “7.3.2 - Gradient Descent with Momentum and Learning Rate Decay” in the assignment as an example. I don’t think weight decay and learning rate decay are the same thing, but am I wrong, and they are actually the same thing?

Hello @Ayush_Nigade

We implement learning rate decay using the method of “exponential weighted decay”. Decaying the learning rate is the objective, whereas exponential weighted decay is just one of several possible algorithms, and the one chosen here to implement that objective. In other words, learning rate decay and weight decay (an L2-style penalty on the weights themselves) are different things; the decay discussed in those exercises is applied to the learning rate.
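
To make the distinction concrete, here is a minimal sketch of an epoch-based learning rate decay schedule and, for contrast, an SGD step with weight decay. The function names, arguments (`learning_rate0`, `decay_rate`, `time_interval`, `weight_decay`), and the exact formula are illustrative assumptions; the assignment’s own implementation may differ:

```python
import numpy as np

def schedule_lr_decay(learning_rate0, epoch_num, decay_rate, time_interval=1000):
    """Learning rate decay: the step size shrinks as training progresses.
    Illustrative formula only; the assignment's version may differ."""
    # The rate is reduced once every `time_interval` epochs (a common scheduling choice).
    return learning_rate0 / (1 + decay_rate * np.floor(epoch_num / time_interval))

def sgd_step_with_weight_decay(w, grad, learning_rate, weight_decay):
    """Weight decay, by contrast, penalizes the weights themselves:
    each update shrinks w toward zero, independently of any learning rate schedule."""
    return w - learning_rate * (grad + weight_decay * w)

# The learning rate gets smaller as training goes on:
print(schedule_lr_decay(0.1, epoch_num=0, decay_rate=1))     # 0.1
print(schedule_lr_decay(0.1, epoch_num=5000, decay_rate=1))  # ~0.0167
```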

Cheers,
Raymond
thanks!