Adam Advanced Optimization Algorithm

MLS Course 2, Week 2: Advanced Optimization
From the screenshot with the earlier timestamp: Adam would have as many learning rates as there are weights in the neural network, plus one learning rate for b.

In the later screenshot of the same slide, only a single learning-rate value is initialized(?). I expected to see multiple initialized values.


I understand that an in-depth discussion of Adam is reserved for the advanced Deep Learning courses; I am just trying to understand the Keras implementation discussed here.
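For context, this is roughly the kind of Keras call I mean, where only one learning-rate value is passed in (a minimal sketch; the layer sizes and the 1e-3 value are placeholders, not the exact lab code):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(25, activation="relu"),
    tf.keras.layers.Dense(15, activation="relu"),
    tf.keras.layers.Dense(10, activation="linear"),
])

model.compile(
    # One global initial learning rate, as shown on the later slide
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```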
Thanks

Did you try this?

Thanks, I have gone through it (and some additional internet material).
So, to start there would be one learning-rate value, but it is then adjusted and (re)adjusted per parameter based on the first moment (an exponentially weighted average of the gradients, i.e., the derivatives of the cost with respect to each parameter) and the second moment (an exponentially weighted average of the squared gradients)?
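Something like this is how I now picture one step (a rough NumPy sketch of the standard Adam update from the paper, not the actual Keras internals; the function name and defaults are my own):

```python
import numpy as np

def adam_step(w, grad, m, v, t, alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameter vector w at step t (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad       # 1st moment: EW average of gradients
    v = beta2 * v + (1 - beta2) * grad**2    # 2nd moment: EW average of squared gradients
    m_hat = m / (1 - beta1**t)               # bias correction for the zero-initialized moments
    v_hat = v / (1 - beta2**t)
    # Single global alpha, but the division by sqrt(v_hat) is elementwise,
    # so each parameter effectively gets its own step size.
    w = w - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

So the single initialized value is the global alpha, and the per-parameter adaptation comes from the elementwise m and v estimates rather than from storing a separate learning rate for every weight.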