In the screenshot below, to normalize x1 and x2 so that the normalized result lies within [-1, 1], shouldn't we divide x by sigma instead of sigma squared? I'm not sure if this is a typo or my misunderstanding.
I tried to test this with the simplest example:
x1 = [1, 3], x2 = [1, 5];
After subtracting the mean, x1 = [-1, 1], x2 = [-2, 2]
Sigma squared (x1) = (1 + 1)/2 = 1
Sigma squared (x2) = (4 + 4)/2 = 4
If we divide by sigma squared, we get x1 = [-1, 1] but x2 = [-1/2, 1/2]; in that case x2's variance is not around 1, is it?
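For reference, here is a quick NumPy sketch of the same check (just an illustration with the numbers above, not taken from the lecture). Dividing by sigma brings both features to variance 1, while dividing by sigma squared does not:

```python
import numpy as np

x1 = np.array([1.0, 3.0])
x2 = np.array([1.0, 5.0])

for name, x in [("x1", x1), ("x2", x2)]:
    centered = x - x.mean()            # subtract the mean
    sigma2 = (centered ** 2).mean()    # variance (sigma squared)
    sigma = np.sqrt(sigma2)            # standard deviation (sigma)

    by_sigma = centered / sigma        # divide by sigma (z-score normalization)
    by_sigma2 = centered / sigma2      # divide by sigma squared instead

    print(name, "divided by sigma:  ", by_sigma, "variance =", by_sigma.var())
    print(name, "divided by sigma^2:", by_sigma2, "variance =", by_sigma2.var())
```

With these numbers, dividing x2 by sigma = 2 gives [-1, 1] (variance 1), whereas dividing by sigma squared = 4 gives [-1/2, 1/2] (variance 1/4), which matches what I computed by hand.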
In the context of normalization it should not be squared, but maybe there is a different context here that I can't remember right now because it's been a long time.
Hello @HuiminShi,
It is sigma that we want to use here, not sigma squared.
Below is a screenshot of the lecture, which uses sigma. Did you get your screenshot from the slide PDF file? Note that only the lecture video is maintained; the PDF files are not.
Cheers,
Raymond
Thank you so much Raymond, that explains it. Yeah, I was reading the PDF file for this part earlier and got confused there. It's all clear now, thank you!