Bug in learning rate decay code

In Week 2 of Course 2, the learning rate decay formula is given as

`learning_rate = learning_rate0 / (1 + decay_rate * epoch_num)`

So the new learning rate is computed from the *initial* learning rate (`learning_rate0`) every time.

However, in the coding exercise ("Optimization Methods"), the previous epoch's learning rate is passed in to calculate the new value.

So I think this is wrong. We need to save the initial learning rate as `learning_rate0` and then compute each new learning rate by passing `learning_rate0` as a parameter to the decay function.
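To illustrate the difference, here is a minimal sketch (the helper name `update_lr` and the numeric values are hypothetical, not taken from the assignment). Feeding the previous epoch's rate back into the formula compounds the decay, while always passing `learning_rate0` gives the intended `1 / (1 + decay_rate * epoch_num)` schedule:

```python
def update_lr(learning_rate0, epoch_num, decay_rate):
    # Hypothetical helper: the decay is always applied to the
    # initial learning rate, not to the previous epoch's rate.
    return learning_rate0 / (1 + decay_rate * epoch_num)

learning_rate0 = 0.1
decay_rate = 1.0

# Correct: always decay from learning_rate0.
correct = [update_lr(learning_rate0, epoch, decay_rate) for epoch in range(4)]
print(correct)  # shrinks as 1/(1 + epoch): 0.1, 0.05, 0.0333..., 0.025

# Buggy: feeding the previous epoch's rate back in compounds the decay.
lr = learning_rate0
buggy = []
for epoch in range(4):
    lr = lr / (1 + decay_rate * epoch)  # wrong: uses previous lr
    buggy.append(lr)
print(buggy)  # decays much faster: 0.1, 0.05, 0.01666..., 0.004166...
```

After a few epochs the two schedules diverge noticeably, which is why the bug matters even though the first couple of values look similar.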


Hi @rahul_rewale, great catch!
Yes, this is a bug.
It has been reported and a fix will be available soon.