Error in illustration in video "Optimization using Gradient Descent - Least squares with multiple observations"

It should be 2n not 2m

Also the narration says ‘m’ instead of ‘n’
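
For context, with n observations (x_1, y_1), ..., (x_n, y_n) the least squares cost in this lesson is presumably of the standard form

$$
L(m, b) = \frac{1}{2n} \sum_{i=1}^{n} \big( (m x_i + b) - y_i \big)^2,
$$

so the factor in the denominator should be 2n (n = number of observations), not 2m, since m already denotes the slope of the line.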

Hello, @toontalk Thanks for reporting. Please send the link and the timestamp of the above error.

https://www.coursera.org/learn/machine-learning-calculus/lecture/1HQXp/optimization-using-gradient-descent-least-squares-with-multiple-observations

Minute 4:00

Hello @toontalk The issue has been reported. Thanks a lot.
If you come across any other error, feel free to open a new topic.

Hi, I have two other small errors to report, again in the same video lesson, starting at approximately 4:35.

The cost function L(m, b) is relabeled L1(m, b) just below, at the point where the gradient is taken.
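
For clarity, the gradient taken at that point is, assuming the standard 1/(2n) cost above,

$$
\frac{\partial L}{\partial m} = \frac{1}{n} \sum_{i=1}^{n} \big( (m x_i + b) - y_i \big) x_i, \qquad
\frac{\partial L}{\partial b} = \frac{1}{n} \sum_{i=1}^{n} \big( (m x_i + b) - y_i \big),
$$

and it should refer to the same function L(m, b) throughout, not a separately named L1(m, b).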

Furthermore, the successive lines that gradually approach the optimal one should be written as y = m_i * x + b_i, where (m_i, b_i) are the coefficients computed at the i-th step of gradient descent, but on the slide they appear as y = m * x_i + b_i.
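
To make the intended notation concrete, here is a minimal Python sketch (not the course's code; the learning rate, number of steps, initial values, and the function name gradient_descent_line are illustrative assumptions) in which every iteration i produces a new pair (m_i, b_i), and therefore a new line y = m_i * x + b_i:

```python
import numpy as np

def gradient_descent_line(x, y, lr=0.01, steps=100):
    """Fit y ≈ m*x + b by gradient descent on the least squares cost.

    Returns the list of (m_i, b_i) pairs, one per iteration, so the i-th
    approximating line is y = m_i * x + b_i (both coefficients change at
    every step, not just b).
    """
    n = len(x)
    m, b = 0.0, 0.0                      # initial guess (assumption)
    history = [(m, b)]
    for _ in range(steps):
        residual = (m * x + b) - y       # prediction error for each observation
        grad_m = (1 / n) * np.sum(residual * x)   # dL/dm for L = 1/(2n) * sum(residual**2)
        grad_b = (1 / n) * np.sum(residual)       # dL/db
        m -= lr * grad_m
        b -= lr * grad_b
        history.append((m, b))
    return history

# Each entry of `history` gives one intermediate line y = m_i * x + b_i
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.1])
for i, (m_i, b_i) in enumerate(gradient_descent_line(x, y)[:3]):
    print(f"step {i}: y = {m_i:.3f} * x + {b_i:.3f}")
```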