DNN video | Choosing new learning rate (1:58)

How did he calculate or choose the new learning rate when he said:
“When it’s done we can then plot the results of the loss
against the learning rates.
We can then inspect the lower part of the curve before it gets unstable.
And we’ll come up with the value.
In this case it looks to be about two notches to the left of 10 to the minus 5.
So I’ll say it’s 8 times 10 to the minus 6, or thereabouts.”

Hi @talhairfan,

There isn’t a big difference in convergence whether you pick 7 or 8 times 10E-6. Differences between the values you get when you run your code and the values the author got can arise because neural networks use randomness by design.
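For context, the loss-versus-learning-rate curve in the video comes from sweeping the learning rate exponentially over a short training run and recording the loss at each step. A minimal sketch of that kind of schedule is below; the exact constants (start at 1e-8, a factor of 10 every 20 epochs) are an assumption for illustration, not necessarily the ones used in the video:

```python
# Exponential learning-rate sweep: multiplies the LR by 10 every
# 20 epochs, starting from 1e-8 (constants are illustrative assumptions).
def lr_for_epoch(epoch):
    return 1e-8 * 10 ** (epoch / 20)

print(lr_for_epoch(0))  # 1e-08
# Over 100 epochs this sweeps the LR from 1e-8 up to about 1e-3.

# In Keras you would typically plug this into training via a callback:
# lr_schedule = tf.keras.callbacks.LearningRateScheduler(lr_for_epoch)
# history = model.fit(..., epochs=100, callbacks=[lr_schedule])
```

After the run, plotting loss against learning rate on a log axis gives the curve the author inspects; "two notches to the left of 10 to the minus 5" just means two minor ticks down on that log axis, which lands around 8e-6.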

You can eliminate that randomness and make things reproducible by adding this at the beginning of your code (this is TF-version dependent, so check yours):

from numpy.random import seed
seed(1)

from tensorflow.random import set_seed
set_seed(2)

Anyway, since you have `history`, you can search for the minimum among the values stored there by adding some code. I kept it simple, but you can make it as complex as you want. :slight_smile:

import pandas as pd

result = pd.DataFrame(data=history.history["loss"], index=history.history["lr"], columns=["loss"])

print(f"Loss is {result['loss'].min()} with learning rate {result['loss'].idxmin()}")
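To see the lookup in action without retraining, here it is run on a synthetic stand-in for `history.history` (the numbers are made up for illustration, not results from the actual model):

```python
import pandas as pd

# Synthetic stand-in for history.history (illustrative values only):
# loss dips around 8e-6 and blows up once the LR gets too large.
fake_history = {
    "lr":   [1e-8, 1e-7, 1e-6, 8e-6, 1e-5, 1e-4],
    "loss": [0.90, 0.70, 0.40, 0.20, 0.35, 1.50],
}

result = pd.DataFrame(data=fake_history["loss"],
                      index=fake_history["lr"],
                      columns=["loss"])

best_loss = result["loss"].min()
best_lr = result["loss"].idxmin()  # index label of the row with smallest loss

print(f"Loss is {best_loss} with learning rate {best_lr}")
# → Loss is 0.2 with learning rate 8e-06
```

Note that `idxmin()` is called on the `"loss"` column so it returns a single index label (the learning rate), rather than on the whole DataFrame, where it would return a Series.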

Give it a try!

Best,