Question about computing variance for normalization

Hi all,

I have a basic question about computing variance.
As I understand it, the variance sigma^2 should be:
f1: sigma^2 = 1/(m-1) * sum((x_i - μ)**2) → each x_i should have the mean subtracted;

But in the slides, the formula is :
f2: sigma^2 = 1/m * sum(x**2) → without subtracting the mean.

Is there a mistake in my formula f1? If f1 is correct, why does formula f2 work?
Thanks a lot!

If you look at the bottom left of the slide, what is done there is: first we subtract the mean from x, and then we square that centered x. So overall it is the same as your f1; it's just broken into two steps.
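For concreteness, here is a small NumPy sketch (the array values and variable names are just for illustration) showing that centering x first and then averaging the squares matches f1 up to the 1/m vs. 1/(m-1) factor:

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])
m = len(x)
mu = x.mean()

# f1: subtract the mean inside the sum, with the 1/(m-1) factor
f1 = np.sum((x - mu) ** 2) / (m - 1)

# f2 as in the slides: first center x, then take the mean of the squares (1/m)
x_centered = x - mu                # step 1: subtract the mean
f2 = np.sum(x_centered ** 2) / m  # step 2: average the squares

# The two differ only by the normalizing constant: f2 = f1 * (m-1)/m
print(f1, f2)
```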


Thank you so much! I misunderstood it.
But why does it use 1/m rather than the unbiased estimate 1/(m-1)?

I'm not sure, but I think here we treat the training set as the population and make our assumptions based on that, hence 1/m. Using 1/(m-1) wouldn't hurt either, though.

Makes sense. If we assume the training dataset is the population, then 1/m is already unbiased.
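A quick sketch of why this rarely matters in practice: NumPy's `ddof` parameter selects between the two conventions, and for large m they differ only by a factor of (m-1)/m (the sample data here is randomly generated just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=2.0, size=10_000)  # true variance = 4

pop_var = x.var(ddof=0)     # 1/m     ("population" convention, as in the slides)
sample_var = x.var(ddof=1)  # 1/(m-1) (unbiased sample estimate)

# For large m the gap is sum((x-mu)**2) / (m*(m-1)), which is tiny
print(pop_var, sample_var, sample_var - pop_var)
```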

Thank you.
