Week 2, logistic regression: 2 questions

@Igor_Goldberg,

The story is this:

  1. You start off with x_1 only (and y, of course).
  2. You think some non-linear term might help, so I suggested x_2 = x_1^2 and x_3 = x_1^3 as examples.
  3. You use a calculator to compute x_2 = x_1^2
  4. You use a calculator to compute x_3 = x_1^3
  5. Now we have x_1, x_2, x_3.
  6. We have three input features.
  7. We have three weights and one bias.
  8. We have y = sigmoid(w_1x_1 + w_2x_2 + w_3x_3 + b)
  9. That is logistic regression.
  10. If you want to discuss y = sigmoid(w_1x_1 + w_2x_2 + w_3x_3 + b) in the language of a neural network, that's fine. It is one neuron that accepts three FEATURES: x_1, plus x_2 and x_3, which are generated from x_1. They are THREE features.
  11. That neuron is in a layer called the "output layer", because it produces the output. It is NOT a hidden layer, and a hidden layer is NOT an output layer. These are just names.
  12. That neuron has a sigmoid as its activation.
  13. Points 10, 11, and 12 are for when you want to discuss the logistic regression formulation in the vocabulary of a neural network.
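The flow above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the course: the weight and bias values below are made-up placeholders (in practice they would be learned by gradient descent), and only the structure matters: one input x_1, two engineered features, one sigmoid neuron.

```python
import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Step 1: start with x_1 only
x_1 = 2.0

# Steps 3-4: compute the engineered non-linear features
x_2 = x_1 ** 2   # x_1 squared
x_3 = x_1 ** 3   # x_1 cubed

# Steps 5-7: three features, three weights, one bias
x = np.array([x_1, x_2, x_3])
w = np.array([0.5, -0.1, 0.02])  # hypothetical weights (would be learned)
b = -0.3                         # hypothetical bias (would be learned)

# Step 8: one "output layer" neuron with a sigmoid activation --
# this IS logistic regression
y = sigmoid(w @ x + b)
print(y)
```

Even though we feed the model three numbers, they all derive from the single original input x_1; the model is still a single sigmoid unit, whichever vocabulary you use to describe it.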

Is the flow clear?
