The story is this:
- You start off with x_1 only, and y of course.
- You think some non-linear term helps, so I suggested x_2 = x_1^2 and x_3 = x_1^3 as examples.
- You use a calculator to compute x_2 = x_1^2
- You use a calculator to compute x_3 = x_1^3
- Now we have x_1, x_2, x_3.
- We have three input features.
- We have three weights and one bias.
- We have y = sigmoid(w_1 x_1 + w_2 x_2 + w_3 x_3 + b)
- That is logistic regression.
- If you want to discuss y = sigmoid(w_1 x_1 + w_2 x_2 + w_3 x_3 + b) in the language of neural networks, that's fine. It is one neuron that accepts three FEATURES: x_1, plus the x_2 and x_3 that were generated from x_1. They are THREE features.
- That neuron is in a layer called the "output layer", because it produces the output. It is NOT a hidden layer; a hidden layer is not an output layer. These are just names.
- That neuron has a sigmoid as its activation function.
- The last three points are for when you want to discuss the logistic regression formulation in the vocabulary of neural networks.
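
The whole flow above can be sketched in a few lines of plain Python. This is a minimal illustration, not a trained model: the feature value, weights, and bias below are made-up numbers standing in for whatever training would produce.

```python
import math

x1 = 2.0        # the single original feature you start with
x2 = x1 ** 2    # engineered feature: x_1 squared
x3 = x1 ** 3    # engineered feature: x_1 cubed

# Hypothetical parameters (in practice these come from training).
w1, w2, w3, b = 0.5, -0.1, 0.02, 0.3

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One output neuron with a sigmoid activation over the THREE features:
# this is exactly logistic regression.
y = sigmoid(w1 * x1 + w2 * x2 + w3 * x3 + b)
print(y)
```

Note that the model itself is still linear in its inputs; the non-linearity in x_1 comes entirely from the engineered features x_2 and x_3 computed beforehand.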
Is the flow clear?