Error in Optional lab: feature scaling and learning rate

This has already been mentioned here, but I just wanted to further emphasize that the text should be corrected.

As you can see in the picture above, when the learning rate is 9e-7, the cost is still high (about 20,000) by the 20th iteration. In the next example, the learning rate is reduced by a factor of 9, to 1e-7, and this is what the graph looks like:

Hopefully you can see that convergence has actually sped up significantly after reducing the learning rate. Yet the highlighted statement explicitly says otherwise.
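
In case it helps anyone reproduce the comparison, here is a minimal sketch of the experiment. The data below is synthetic (the lab loads its own housing arrays), so the exact cost values won't match the plots above, but it runs the same batch gradient descent at both learning rates and compares the cost after 20 iterations:

```python
import numpy as np

def compute_cost(X, y, w, b):
    """Halved mean-squared-error cost, matching the lab's J(w,b)."""
    m = X.shape[0]
    err = X @ w + b - y
    return (err @ err) / (2 * m)

def gradient_descent(X, y, alpha, iters):
    """Batch gradient descent from zero weights; returns the cost history."""
    m, n = X.shape
    w, b = np.zeros(n), 0.0
    history = []
    for _ in range(iters):
        err = X @ w + b - y              # residuals, shape (m,)
        w = w - alpha * (X.T @ err) / m  # simultaneous update of w ...
        b = b - alpha * err.sum() / m    # ... and b from the same residuals
        history.append(compute_cost(X, y, w, b))
    return history

# Synthetic stand-in for the lab's unscaled housing data: the sqft column
# dwarfs the others, so the problem is badly conditioned without scaling.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(800, 3500, 50),  # size in sqft (large magnitude)
    rng.integers(1, 6, 50),      # bedrooms
    rng.integers(1, 3, 50),      # floors
    rng.integers(1, 80, 50),     # age in years
])
y = X @ np.array([0.1, 20.0, -10.0, -1.0]) + 200 + rng.normal(0, 10, 50)

# Compare the two learning rates from the lab over 20 iterations.
for alpha in (9e-7, 1e-7):
    final_cost = gradient_descent(X, y, alpha, iters=20)[-1]
    print(f"alpha = {alpha:.0e}: cost after 20 iterations = {final_cost:,.2f}")
```

With unscaled features like these, the larger rate can oscillate or even diverge while the smaller one decreases smoothly, which is why the smaller rate can win here and why the lab pairs this discussion with feature scaling.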

I understand the intuition behind this, and if you're interested, an explanation is linked in the post above. I just wanted to call attention to the highlighted statement, as it is misleading. I hope to see it fixed soon!

Thanks for your report. I’ll investigate and submit a support ticket if needed.
