Course 2 Week 1 Video "Normalizing Inputs" doubt regarding "m"

Course 2 Week 1 Video “Normalizing Inputs”
At 1:40 Professor Ng says to “use the same mu and sigma for the test inputs”. When computing mu and sigma, both were divided by “m” (the number of examples in the training set), but the test set will have a different number of examples (so a different “m”).

My question is: why should we use the same sigma and mu for both the training set and the test set?

Hi Abz,
Let’s assume we don’t reuse the mu and sigma we obtained at training time, and instead compute a new mu and sigma across our test set.

In that case, we have trained our neural network on inputs normalized with a particular mu and sigma; at training time we always fed it data transformed with those two parameters, but at test time we would suddenly take them away and substitute different ones. The network’s weights were tuned for inputs on one particular scale, so feeding it inputs shifted and scaled differently would degrade its predictions.
Isn’t it like teaching a monkey to separate bananas and apples, but covering its eyes when we test what it has learned?
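To make this concrete, here is a minimal NumPy sketch (with made-up random data) showing the standard practice: compute mu and sigma from the training set only, then apply those same statistics to both sets.

```python
import numpy as np

# Hypothetical data: rows are examples, columns are features.
rng = np.random.default_rng(0)
X_train = rng.normal(loc=5.0, scale=2.0, size=(1000, 3))
X_test = rng.normal(loc=5.0, scale=2.0, size=(200, 3))

# Compute mu and sigma from the TRAINING set only (divide by the training m).
mu = X_train.mean(axis=0)
sigma = X_train.std(axis=0)

# Apply the SAME mu and sigma to both sets.
X_train_norm = (X_train - mu) / sigma
X_test_norm = (X_test - mu) / sigma

# The training set is exactly standardized (mean 0, std 1 per feature),
# while the test set is only approximately so -- and that's fine:
# both sets now live on the same scale the network was trained on.
print(X_train_norm.mean(axis=0).round(6))
print(X_test_norm.mean(axis=0).round(2))
```

If we instead computed a separate mu and sigma on the test set, the test inputs would land on a slightly different scale than the one the network learned, which is exactly the “covered eyes” problem described above.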