MLS-C2-Week1 Behavior of each neuron

Hello everyone, and congratulations on creating this Specialization. I would like to ask a question about the neurons inside a layer. As I understand it so far, each neuron combines the features from the previous layer and creates more complex, sophisticated features. My question is whether it is possible for some neurons to end up with the same weights and therefore create the same combination of features.

Thank you for your time!

Hi there,

Features are the data fed into the neural net; the transformations that happen afterwards are not usually called features. Yes, some neurons can end up having the same weights, but their outputs may or may not be the same, because each neuron's output depends on its upstream feed, which in most cases differs from neuron to neuron due to the complexity of the model.
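A toy sketch of that last point (my own illustration, not code from the course): two neurons can share the exact same weight vector, yet produce different outputs when they sit in different layers, because they receive different upstream inputs.

```python
import numpy as np

# Same weight vector reused by a neuron in layer 1 and a neuron in layer 2.
w = np.array([0.5, -0.3])
relu = lambda z: np.maximum(0.0, z)

x = np.array([2.0, 1.0])            # raw input features
# Layer 1: two neurons with identical weights see the SAME input x,
# so they necessarily compute the same output.
h = relu(np.array([w @ x, w @ x]))   # both entries equal

# Layer 2: a neuron with the very same weights sees h, not x,
# so its output differs from the layer-1 neurons.
out = relu(w @ h)

print(h, out)   # layer-1 outputs match each other; layer-2 output differs
```

So within a single layer, identical weights do mean identical outputs (the inputs are shared); across layers, the differing upstream feed breaks the tie.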

Thank you very much for your response and explanation!



Your question reminds me of this research by Google.



I had a similar thought and a question. I was under the impression that all of the weights for the neurons in a layer had to be the same: I thought we had to initialize the weights to 0 (as one of the lectures seemed to suggest). Then I found out that we should not, which makes sense!

It is mentioned in the CNN lecture notes:

" Weight Initialization

" Pitfall: all zero initialization. Let's start with what we should not do. Note that we do not know what the final value of every weight should be in the trained network, but with proper data normalization it is reasonable to assume that approximately half of the weights will be positive and half of them will be negative. A reasonable-sounding idea then might be to set all the initial weights to zero, which we expect to be the “best guess” in expectation. This turns out to be a mistake, because if every neuron in the network computes the same output, then they will also all compute the same gradients during backpropagation and undergo the exact same parameter updates. In other words, there is no source of asymmetry between neurons if their weights are initialized to be the same. "
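The symmetry argument in that quote can be checked numerically. Below is a minimal sketch (my own toy example, with made-up sizes and a constant initial weight) of a 3 → 4 → 1 network: when every weight starts at the same value, all hidden units compute the same output and, after backpropagation, receive exactly the same gradient column, so gradient descent can never make them differ.

```python
import numpy as np

# Toy 3 -> 4 -> 1 network where every weight starts at the same constant
# (an assumed setup to illustrate the symmetry problem from the quote).
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))            # small batch of inputs
y = rng.normal(size=(5, 1))            # dummy regression targets

W1 = np.full((3, 4), 0.5); b1 = np.zeros(4)
W2 = np.full((4, 1), 0.5); b2 = np.zeros(1)

# Forward pass with a ReLU hidden layer:
# every column of (x @ W1) is identical, so all 4 hidden units agree.
h = np.maximum(0, x @ W1 + b1)
out = h @ W2 + b2

# Backward pass for mean squared error.
d_out = 2 * (out - y) / len(x)
dh = d_out @ W2.T                      # identical columns, since W2 is constant
dh[h <= 0] = 0                         # ReLU gradient mask (same per column)
dW1 = x.T @ dh                         # gradient w.r.t. W1

# Every hidden unit gets exactly the same gradient column, so no update
# can ever make the units differ -- they stay clones of each other.
print(np.allclose(dW1, dW1[:, :1]))    # -> True
```

Re-running the same code with `rng.normal`-initialized weights instead of `np.full` makes the gradient columns differ, which is exactly why random initialization breaks the symmetry.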