In a logistic regression graph we have a single node representing z = wᵀ × x + b in a layer.
But in a single layer of a neural network there are four nodes:
z^[1]_1 = w^[1]_1ᵀ × x + b^[1]_1
z^[1]_2 = w^[1]_2ᵀ × x + b^[1]_2
z^[1]_3 = w^[1]_3ᵀ × x + b^[1]_3
z^[1]_4 = w^[1]_4ᵀ × x + b^[1]_4
[1] represents the 1st layer
1, 2, 3, 4 represent the nodes of that layer…
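To make the comparison concrete, here is roughly what I mean in NumPy (just a sketch with made-up sizes, e.g. 3 input features):

```python
import numpy as np

np.random.seed(0)
n_x = 3                              # hypothetical number of input features
x = np.random.randn(n_x, 1)

# Logistic regression: a single node with one weight vector and one bias
w = np.random.randn(n_x, 1)
b = 0.0
z = np.dot(w.T, x) + b               # shape (1, 1) -- one output

# Layer 1 of a neural network: four nodes, each with its own w and b,
# all applied to the same input x. Stacking the weight vectors as rows of W1
# gives all four z values in one matrix product.
W1 = np.random.randn(4, n_x)         # row i is w^[1]_i transposed
b1 = np.zeros((4, 1))
Z1 = np.dot(W1, x) + b1              # shape (4, 1) -- z^[1]_1 ... z^[1]_4
```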
Why do we have multiple nodes in a layer of a neural network, whereas logistic regression has a single node for the same equation?
A neural network is a different (more general) architecture than logistic regression. You can think of LR as being just the output layer of a Neural Net. That is the point that Prof Ng is making when he titles the Week 2 exercise “Logistic Regression with a Neural Network Mindset”.
Ya, I got the point, but can you explain the purpose of having multiple nodes in a single layer of an NN and just one node in a single layer of LR?
The point is what I said earlier: LR is just the output layer of the NN, so it has to have one output, which is the “yes/no” classification result, right?
In the earlier layers of a Neural Network, the power comes from the fact that you can have lots of outputs from each of the layers. That enables the network to learn to detect lots of different things in the inputs: each neuron will be “specialized” during back propagation because of Symmetry Breaking. The whole point is that you don’t want them all to learn the same thing. E.g. in the case of images, the inputs are incredibly complicated and contain lots of different shapes, edges, curves, colors. You need to be able to detect lots of different low level features and then put those together in the later layers to recognize more complex features like a cat’s ear or tail or whiskers.
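As a rough illustration (a NumPy sketch, not code from the assignment): if you initialized a layer's weights to zeros, every node would compute the same value and get the same gradient, so they could never specialize; random initialization is what breaks that symmetry.

```python
import numpy as np

np.random.seed(2)
n_x, n_h = 3, 4                      # hypothetical sizes: 3 inputs, 4 hidden nodes
x = np.random.randn(n_x, 1)

# Zero initialization: every node in the layer computes exactly the same thing,
# so back propagation gives each one the same update and they never specialize.
W1_zero = np.zeros((n_h, n_x))
A1_zero = np.tanh(np.dot(W1_zero, x))        # all four activations identical

# Random initialization breaks the symmetry: each node starts out different,
# so it can learn to detect a different feature of the input.
W1_rand = np.random.randn(n_h, n_x) * 0.01
A1_rand = np.tanh(np.dot(W1_rand, x))        # four distinct activations

print(A1_zero.ravel())    # [0. 0. 0. 0.]
print(A1_rand.ravel())    # four different numbers
```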
If each layer only had one output, then there’s really not a lot of point in having multiple layers. Draw the picture by analogy with the network diagrams that Prof Ng shows. Not very interesting with only one output per layer, right?
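To tie that back to the code (again just a NumPy sketch, not the assignment's code): the output layer on its own is exactly the logistic regression computation; the hidden layer just hands it richer features than the raw input x.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

np.random.seed(1)
n_x, n_h = 3, 4                      # hypothetical sizes
x = np.random.randn(n_x, 1)

# Hidden layer: four nodes producing four outputs -- the part LR doesn't have.
W1 = np.random.randn(n_h, n_x) * 0.01
b1 = np.zeros((n_h, 1))
A1 = np.tanh(np.dot(W1, x) + b1)     # shape (4, 1)

# Output layer: a single node, sigmoid(W2 · A1 + b2). This step alone is
# logistic regression, just applied to A1 instead of directly to x.
W2 = np.random.randn(1, n_h) * 0.01
b2 = np.zeros((1, 1))
A2 = sigmoid(np.dot(W2, A1) + b2)    # shape (1, 1): the yes/no probability
```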
Okay, now I understand it clearly. Thanks a lot for the great explanation.