In this figure the slope, d(J(w))/dw, decreases steadily as w approaches the minimum. So even though alpha is constant, the update alpha * d(J(w))/dw shrinks, and gradient descent naturally slows down near the minimum, where the slope tends to 0.
The learning rate (alpha) is a constant.
Since the derivative (the slope) approaches zero at the bottom of the curve, the steps toward the minimum keep getting smaller.
The learning rate alpha is constant, but the derivative changes, so the gradient descent step changes in the first slide as well. That slide, however, only discusses alpha, which is constant in both slides.
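To make this concrete, here is a minimal sketch of the idea. The cost function J(w) = (w - 3)^2, the starting point, and the value of alpha are illustrative assumptions (not from the slides); the point is that with a fixed alpha, the step alpha * d(J(w))/dw shrinks on its own as the slope flattens near the minimum:

```python
# Minimal sketch: fixed learning rate, shrinking steps near the minimum.
# Assumed example cost: J(w) = (w - 3)**2, so d(J(w))/dw = 2 * (w - 3).

def dJ_dw(w):
    return 2 * (w - 3)      # slope of J(w) = (w - 3)^2

alpha = 0.1                 # learning rate: held constant throughout
w = 0.0                     # arbitrary starting point

for i in range(10):
    grad = dJ_dw(w)
    step = alpha * grad     # actual step taken this iteration
    w = w - step
    print(f"iter {i}: w = {w:.4f}, slope = {grad:.4f}, step = {step:.4f}")

# The printed |step| shrinks every iteration even though alpha never
# changes, because the slope d(J(w))/dw tends to 0 as w nears the minimum.
```

Running this shows the step size decreasing each iteration, which is exactly why gradient descent slows down near the minimum without any schedule on alpha.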