Neural Network Units and Layers

In a neural network, I understand that the first layer has as many units as the number of input features. Units are basically the weights of each feature. So, at each layer we cannot vary the number of units, since it is all determined by the input.

Is my understanding correct? If not, please clear my doubt.
Thanks in advance!


The number of layers and the number of units inside each hidden layer are something that you define. I invite you to read this other post, where there is a more thorough answer on this exact topic:


Hi, @Mudita_Bansal !

Beware that the weights are not the number of units but the "arrows" that connect the units of one layer to those of the next. You can always decide how many units each layer has (except the input layer, whose size is fixed by the input shape). For a dense layer, the number of trainable parameters is (n_{input} + 1) \cdot m_{units}, where the +1 accounts for the bias term of each unit.
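To make the parameter count concrete, here is a minimal NumPy sketch of a dense layer's shapes, assuming 4 input features and 3 units (both numbers are just an example):

```python
import numpy as np

n_input = 4   # number of input features (fixed by the data)
m_units = 3   # number of units in this layer (your choice)

# A dense layer's trainable parameters: one weight per
# (input, unit) connection, plus one bias per unit.
W = np.zeros((n_input, m_units))  # weight matrix, shape (4, 3)
b = np.zeros(m_units)             # bias vector, shape (3,)

n_params = W.size + b.size
print(n_params)  # (n_input + 1) * m_units = (4 + 1) * 3 = 15
```

Note that changing m_units only changes the width of the weight matrix, which is why the number of units in a hidden layer is free for you to choose.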
