I have a question about weight initialization in neural network layers.

I have been reading the book [Hands-On Machine Learning with Scikit-Learn…], and one of its exercises asks the following: [Is it OK to initialize all the weights to the same value as long as that value is selected randomly using He initialization?]

I understand that if I initialize all the weights in a layer to the same value, every neuron in that layer computes the same output and receives the same gradient, so the layer behaves as if it had only one neuron. But my question is the following: if I initialize all the weights to the same value (say, zero) and then apply He initialization, what happens to those zero values after He initialization is applied?
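To make my confusion concrete, here is a minimal NumPy sketch of how I understand He (normal) initialization: every weight is drawn independently from a zero-mean Gaussian with variance 2/fan_in, so there is no separate step where a single shared value is "applied" to the weights. (The function name `he_init` is just my own illustration, not from the book.)

```python
import numpy as np

def he_init(fan_in, fan_out, rng=None):
    """He (normal) initialization: each weight is sampled independently
    from N(0, 2/fan_in), so no two weights share one preset value."""
    rng = np.random.default_rng() if rng is None else rng
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

w = he_init(256, 128, rng=np.random.default_rng(0))
print(w.std())                       # close to sqrt(2/256) ≈ 0.088
print(np.unique(w).size == w.size)   # all entries are distinct
```

If this sketch is right, "initialize to the same value and then apply He initialization" would just mean the shared values are overwritten by these fresh random draws, which is what I want to confirm.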