w1-Forward prop- w,b

As I understand it, w and b are applied to the input X, but this is the second time in the text I have seen them and I don't understand what w1_1, w1_2, w1_3 are (and the corresponding b's). Why are they elaborated for a single X with 2 input features? I thought there would be only 1 weight per feature.

Why does a single feature X1,1 end up with three weights w1_1, w1_2, w1_3 across the units, and why does each new neuron get its own new w, b values for the same input X?

The answer is the number of neurons. Having two features does not mean you have only w1 and w2. The size of w depends on the number of neurons in the layer: if you have ten neurons, then you have w1 for the first neuron, w2 for the second, and so on up to w10 for the 10th neuron. Each neuron has its own full weight vector (one entry per input feature) and its own bias, which is why the same input X meets a new w and b at every neuron.
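A minimal NumPy sketch of this idea (the numbers are made up for illustration): one input x with 2 features feeding a layer of 3 neurons, so W holds three weight vectors w1, w2, w3 (one column per neuron) and b holds three biases.

```python
import numpy as np

# Hypothetical example: 2 input features, 3 neurons in the layer.
x = np.array([200.0, 17.0])        # one input X with 2 features

# Column j of W is the weight vector w_j for neuron j.
W = np.array([[1.0, -3.0, 5.0],
              [2.0, 4.0, -6.0]])   # shape (2 features, 3 neurons)
b = np.array([-1.0, 1.0, 2.0])     # one bias per neuron, shape (3,)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward prop for the layer: z_j = w_j . x + b_j for each neuron j.
z = x @ W + b                      # shape (3,), one z per neuron
a = sigmoid(z)                     # 3 activations, one per neuron

print(W.shape, b.shape, a.shape)   # (2, 3) (3,) (3,)
```

So with 2 features and 3 neurons the layer has 2 x 3 = 6 weights and 3 biases, not 2 weights: the neuron count, not the feature count, sets how many w vectors there are.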

Thank you so much, sir, for helping me understand.
If anyone else still needs more intuition on this, please check the demand prediction slides.