I have a basic question about computing variance.
As far as I know, the variance sigma^2 should be:
f1: sigma^2 = 1/(m-1) * sum((x_i - μ)**2) → every x_i should have the mean subtracted;

But in the slides, the formula is:
f2: sigma^2 = 1/m * sum(x_i**2) → without subtracting the mean.

Is there any mistake in my formula f1? If f1 is correct, why does the formula f2 also work?
Thanks a lot!

If you look at the bottom left of the slide, the data is first mean-normalized (the mean is subtracted from each x) and only then squared. So overall it is the same as your f1; it's just that the computation is broken into two separate steps.
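A quick sketch of this point, in plain Python with made-up numbers (using 1/m in both computations so the only difference is where the centering happens):

```python
# Hypothetical sample data; mu is its mean.
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
m = len(x)
mu = sum(x) / m

# f1 style: subtract the mean inside the square, in one expression.
var_f1 = sum((xi - mu) ** 2 for xi in x) / m

# Two-step style from the slide: mean-normalize first, then average the squares.
centered = [xi - mu for xi in x]
var_f2 = sum(c ** 2 for c in centered) / m

print(var_f1, var_f2)  # both print 4.0 for this data
```

So sum(x**2) in f2 is computed on data that has already been centered, which is why no explicit subtraction appears in that formula.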

I'm not sure about that, but I think the training set is treated as the whole population here, which gives the population variance with 1/m. Using 1/(m-1) (the unbiased sample estimator) wouldn't hurt either, especially when m is large.
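To see how little the 1/m vs 1/(m-1) choice matters, here is a small sketch on the same made-up data:

```python
# Population variance (1/m) vs sample variance (1/(m-1)) on made-up data.
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
m = len(x)
mu = sum(x) / m
ss = sum((xi - mu) ** 2 for xi in x)  # sum of squared deviations

pop_var = ss / m           # divides by m
sample_var = ss / (m - 1)  # divides by m - 1

# The two differ only by a factor of (m-1)/m, which approaches 1
# as m grows, so for a large training set the choice barely matters.
print(pop_var, sample_var)
```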