In Week 3's last lecture, Prof. Ng explains why we can't initialize the weights to 0. My question is: couldn't we specify separate weights for each neuron in the layer, rather than one common weight for the entire layer?
Hello @Narayan
Kindly check the post Symmetry Breaking versus Zero Initialization. It might help you.
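In case a concrete demonstration helps: under zero initialization each neuron already does have its own separate weights; the problem is that identical starting values produce identical activations and therefore identical gradients, so the neurons can never diverge from one another. Here is a minimal NumPy sketch of that effect (a 1-hidden-layer sigmoid network; all names and shapes are illustrative, not from the course notebooks):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(W1, b1, W2, b2, X, y, lr=0.5, steps=100):
    """Run a few gradient-descent steps on a 1-hidden-layer sigmoid network."""
    m = X.shape[1]
    for _ in range(steps):
        # forward pass
        A1 = sigmoid(W1 @ X + b1)
        A2 = sigmoid(W2 @ A1 + b2)
        # backward pass (binary cross-entropy loss)
        dZ2 = A2 - y
        dW2 = dZ2 @ A1.T / m
        db2 = dZ2.sum(axis=1, keepdims=True) / m
        dZ1 = (W2.T @ dZ2) * A1 * (1 - A1)
        dW1 = dZ1 @ X.T / m
        db1 = dZ1.sum(axis=1, keepdims=True) / m
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 8))                        # 3 features, 8 examples
y = rng.integers(0, 2, size=(1, 8)).astype(float)  # toy binary labels

# Zero init: each of the 4 hidden neurons has its own 3 weights,
# but they all receive identical gradients, so the rows of W1 stay equal.
W1_zero = train(np.zeros((4, 3)), np.zeros((4, 1)),
                np.zeros((1, 4)), np.zeros((1, 1)), X, y)
print(np.allclose(W1_zero, W1_zero[0]))   # True: all rows identical

# Small random init breaks the symmetry: rows diverge, neurons specialize.
W1_rand = train(rng.normal(size=(4, 3)) * 0.01, np.zeros((4, 1)),
                rng.normal(size=(1, 4)) * 0.01, np.zeros((1, 1)), X, y)
print(np.allclose(W1_rand, W1_rand[0]))   # False: rows differ
```

So the issue isn't that the layer shares one weight variable; it's that identical per-neuron values make all the neurons redundant, and only random initialization lets them learn different features.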