Initializing weights with 0

Hi all. In the Neural Networks and Deep Learning course, Week 3, video 8, the professor says that if we initialize all the weights to 0, then all the hidden units will perform the same computation. Can anyone tell me what output this produces if all the hidden layers use the sigmoid activation? As per my understanding, every neuron should output 1; am I correct? Also, on what grounds does the professor say that 0 initialization is okay for logistic regression?

Hey @Shibdas_Bhattacharya,
Please state the title and time-stamps of the video which you are referring to.

I think you can simply train a neural network and find out for yourself. Just my 2 cents: if all the weights are 0, and assuming the bias terms are also 0, the pre-activation of every neuron in the first layer will be 0, and applying the sigmoid gives 0.5, not 1. The same reasoning then carries forward through the later layers.
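To see this concretely, here is a minimal sketch of a single zero-initialized layer (the layer sizes and input values are arbitrary, just for illustration): every pre-activation is 0, so the sigmoid of each one is 0.5.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical tiny layer: 3 inputs, 4 hidden units, all-zero initialization.
x = np.array([[0.5], [-1.2], [2.0]])   # arbitrary input vector
W1 = np.zeros((4, 3))                  # all weights 0
b1 = np.zeros((4, 1))                  # all biases 0

z1 = W1 @ x + b1                       # every pre-activation is 0
a1 = sigmoid(z1)                       # sigmoid(0) = 0.5, not 1

print(a1.ravel())                      # [0.5 0.5 0.5 0.5]
```

Note that the input `x` doesn't matter at all here: with zero weights, every example produces the same activations, which is exactly the symmetry problem.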

Cheers,
Elemento

I think you must be misunderstanding what Prof Ng is saying. In the case of a Neural Network in Week 3, you cannot initialize the weights all to 0: you need “symmetry breaking”. But in the case of Logistic Regression, the math is different and you can get away with all 0 for the weight initialization and it can still learn. Here is a thread which explains this in more detail and shows the mathematical difference between the two cases.
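A short experiment can illustrate both halves of that argument. The sketch below (my own toy setup, not from the course videos: random data, a 2-3-1 sigmoid network, plain gradient descent) shows that with all-zero initialization the hidden units receive identical gradients and therefore stay identical after any number of updates, whereas zero-initialized logistic regression gets a nonzero gradient on the very first step and can learn.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(2, 5))            # 2 features, 5 examples (toy data)
y = np.array([[0., 1., 1., 0., 1.]])
m = X.shape[1]

# --- 1-hidden-layer network, all-zero initialization ---
W1 = np.zeros((3, 2)); b1 = np.zeros((3, 1))
W2 = np.zeros((1, 3)); b2 = np.zeros((1, 1))

for _ in range(100):                   # plain gradient descent
    A1 = sigmoid(W1 @ X + b1)
    A2 = sigmoid(W2 @ A1 + b2)
    dZ2 = A2 - y
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * A1 * (1 - A1)
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.mean(axis=1, keepdims=True)
    W2 -= 0.5 * dW2; b2 -= 0.5 * db2
    W1 -= 0.5 * dW1; b1 -= 0.5 * db1

# Symmetry is never broken: all rows of W1 are still identical.
print(np.allclose(W1[0], W1[1]) and np.allclose(W1[1], W1[2]))  # True

# --- logistic regression, zero initialization ---
w = np.zeros((2, 1)); b = 0.0
a = sigmoid(w.T @ X + b)               # all predictions are 0.5
dw = X @ (a - y).T / m                 # depends on X directly, so nonzero
print(np.any(dw != 0))                 # True: learning can proceed
```

The key difference: in the network, each hidden unit's gradient flows through identical weights and identical activations, so the units can never differentiate; in logistic regression the gradient is a direct function of the inputs, so zero weights are just an ordinary starting point.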


Yes, thank you for the explanation.