In week 1 of DLS Course 2: Normalizing Inputs

There was a slide on normalizing inputs in this video.

Correct me if I am wrong.

This value here is wrong:

x /= sigma ** 2

It should be

x /= sqrt(sigma ** 2)

I mean, while coding the logic we have to take the square root of the above value, right?

Hi GAURAV2567,

Let me try to answer your question, although I am not sure if I fully understand it.

The standard deviation sigma is defined as the square root of the variance, and the variance is defined as the expectation of the squared deviation of a variable from its population mean. The variance can therefore be written as Var(X), but since it is equal to sigma squared it can also be denoted sigma ** 2. The standard deviation is always non-negative, so it is valid to write x /= sigma, as in the slide at 1:26.
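
Written out in LaTeX (my own notation, assuming the 1/m population convention the lecture uses, with \mu the mean and m the number of examples):

\mathrm{Var}(X) = \sigma^2 = \frac{1}{m}\sum_{i=1}^{m}\left(x^{(i)} - \mu\right)^2, \qquad \sigma = \sqrt{\mathrm{Var}(X)}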

To calculate sigma you have to take the square root of the variance, as implied by the slide. So while coding you would take the square root of the calculated variance (Var(X), or sigma squared), which gives you sigma. I believe this is what you are suggesting.
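
To make it concrete, here is a minimal NumPy sketch of the normalization step (my own illustration, not the course's implementation; I am assuming X has shape (n_features, m) with examples in columns, as in the lectures):

import numpy as np

# Toy data: 2 features x 4 examples (columns are examples, as in the lectures)
X = np.array([[1.0, 2.0, 3.0, 4.0],
              [10.0, 20.0, 30.0, 40.0]])

mu = np.mean(X, axis=1, keepdims=True)        # per-feature mean
X = X - mu                                    # zero-center the data

var = np.mean(X ** 2, axis=1, keepdims=True)  # variance, i.e. sigma ** 2 (1/m convention)
sigma = np.sqrt(var)                          # standard deviation = sqrt of the variance
X = X / sigma                                 # divide by sigma, not by sigma ** 2

The np.sqrt line is exactly the point of your question: dividing by var directly (sigma ** 2) would be the mistake, while dividing by its square root (sigma) is correct.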

I hope this answers your question.