W2 - Logistic Regression as a simple NN

Towards the end of the lecture video on the Logistic Cost Function, Andrew mentions:

“It turns out that logistic regression can be viewed as a very, very small neural network.
In the next video, we’ll go over that so you can start gaining intuition about what neural networks do. So with that let’s go on to the next video about how to view logistic regression as
a very small neural network.”

However, the next video is on Gradient Descent. In fact, none of the videos in W2 touch upon neural nets, a topic that starts in W3. That "logistic regression is like a single-node NN with a single output" is mentioned in the lectures and again in the programming assignment, but why that is so is never explicitly discussed.

Is there a video missing on the topic above?

Hello @dds,

I might not be the best person to comment on the organization of the course, and I am fully aware that you have a different preference about it, given that quote and given that you see W3 as the proper introduction, though W3, of course but unfortunately, comes after W2.

Before writing this reply, I revisited the videos in W1 and W2. W1 gave a simple idea of what a neural network looks like, and in particular this slide shows a NN.

Of course I didn't see anything similar to that in the W2 videos, but if we go to assignment 3 in W2, it did explain logistic regression as a neural network with the following figure.

[image: logistic regression represented as a neural network]
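
If it helps to make the mapping concrete, here is a minimal NumPy sketch (my own illustration, not the assignment's exact code) of why that figure reads as a one-neuron network: the forward pass of logistic regression is a single unit computing a linear combination z = wᵀx + b followed by a sigmoid activation, which is exactly what one neuron in a NN does.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation, the same non-linearity used by the logistic "neuron"
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, b, X):
    """Forward pass of logistic regression, viewed as a one-neuron NN.

    w : (n_x, 1) weight vector, b : scalar bias, X : (n_x, m) batch of m examples.
    Returns A, the (1, m) vector of predicted probabilities.
    """
    Z = np.dot(w.T, X) + b   # linear part of the single unit
    A = sigmoid(Z)           # activation: the unit's output
    return A

# Tiny example: 3 features, 4 examples; zero weights give probability 0.5 everywhere
w = np.zeros((3, 1)); b = 0.0
X = np.random.randn(3, 4)
print(forward(w, b, X))
```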

And throughout Andrew's lectures in W2, he kept using the term "neural network" when explaining the components of logistic regression. For example, he mentioned backward propagation when he walked through the computation graph for logistic regression.
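
To spell out what that backward pass computes, here is a matching sketch (again my own, assuming the mean cross-entropy cost used in the lectures) of the gradients the chain rule gives when you walk that graph backwards:

```python
import numpy as np

def backward(A, X, Y):
    """Backward pass through the logistic-regression computation graph.

    A : (1, m) activations from the forward pass
    X : (n_x, m) inputs, Y : (1, m) labels in {0, 1}
    Returns gradients of the mean cross-entropy cost w.r.t. w and b.
    """
    m = X.shape[1]
    dZ = A - Y                  # dJ/dZ, combining the sigmoid and log-loss derivatives
    dw = np.dot(X, dZ.T) / m    # (n_x, 1) gradient for the weights
    db = np.sum(dZ) / m         # scalar gradient for the bias
    return dw, db
```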

I think there is a strategy to how the material is presented. From my own perspective, the course team presents logistic regression in W2 because it is always good to start with something simple that already has an activation function. The week is not only about logistic regression, though: from time to time we are reminded of how this or that part of the logistic regression algorithm relates to a neural network, and Andrew did talk about that.

Moreover, at the end, the course team prepared an assignment that includes the graphical representation I attached above to show how it looks like a NN. Looking at W2 as a whole, I think it has served that purpose. Then W3 reinforces our memory and understanding of the different components of a NN, not to mention that W3's assignment revisits it through a 2-layer binary classification problem.

Again, I was not part of the course team, and I have not been a DLS mentor since the launch of this specialization, so I am probably not the best person to answer this, but I think it is at least a reasonable strategy to go from logistic regression to a neural network across W2 and W3. It is natural that different learners prefer different strategies, and if you were to design this course, I guess you would probably have introduced a 1-layer NN that solves a binary classification problem directly, instead of introducing logistic regression, which historically is not a neural network. However, I believe the course team has made the best decision.

I understand my reply won't change what you believe is the best way to learn NNs, but I hope at least I can give you my perspective, just as you have given us yours.

Thank you, and cheers,
Raymond :slight_smile:


Dear Raymond,

Thank you so much for the thoughtful and detailed response. I have no issue with the organization of the course at all! As you may have noticed, Andrew often includes an intuitive discussion of deep/fundamental topics, such as why the loss function of logistic regression makes sense (W2 Lecture 8), why DL is taking off now (W1 Lecture 4), or why deep representations work better than shallow networks (W4 Lecture 4). So I was looking forward to his discussion, given the teaser at the end of W2 Lecture 3. Thanks for including the Housing Price Prediction slide; yes, the concept was introduced in sprinkles throughout. Thanks again!

You are welcome @dds!