As a public service announcement and after some research, here are my reviewed diagrams for:
- feed-forward processing in a
- 3-layer neural network (why 3? why not!) doing
- 2-class logistic regression
This includes:
- better arrangement (we now read top to bottom instead of left to right on a looong sheet)
- two separate diagrams (one for the simple and one for the vectorized case)
- the correct convention for the W matrix (as used in week 3)
Enjoy.
Here they are:
Simple
Vectorized
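To make the vectorized diagram concrete, here is a minimal numpy sketch of the forward pass it depicts, assuming tanh hidden units and a sigmoid output as in the course; the layer sizes and random inputs below are made up purely for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, params):
    """Vectorized forward pass through a 3-layer network.

    X has shape (n_x, m): one column per sample, matching the
    course convention A^[0] = X.
    """
    A = X
    caches = []
    L = len(params) // 2  # number of layers (each has a W and a b)
    for l in range(1, L + 1):
        W = params[f"W{l}"]  # shape (n_l, n_{l-1}): one row per neuron
        b = params[f"b{l}"]  # shape (n_l, 1), broadcast over samples
        Z = W @ A + b
        A = sigmoid(Z) if l == L else np.tanh(Z)  # tanh hidden, sigmoid out
        caches.append((Z, A))
    return A, caches

# hypothetical sizes: 4 inputs, hidden layers of 5 and 3, one output unit
rng = np.random.default_rng(0)
sizes = [4, 5, 3, 1]
params = {}
for l in range(1, len(sizes)):
    params[f"W{l}"] = rng.standard_normal((sizes[l], sizes[l - 1])) * 0.01
    params[f"b{l}"] = np.zeros((sizes[l], 1))

X = rng.standard_normal((4, 7))  # 7 samples stacked as columns
Yhat, _ = forward(X, params)
print(Yhat.shape)  # one probability per sample
```

The point of the vectorized view is that one matrix product per layer handles all m samples at once, instead of looping over columns.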
Just as a terminology clarification, the correct way to say this would be “binary classification”. Logistic Regression is a very specific algorithm that does binary classification, but it is not considered a neural network. As Prof Ng explains in the lectures, you can think of LR as a trivial NN: it is just the output layer of a Neural Network.
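That correspondence is easy to see in code. The sketch below (shapes and random values are illustrative, not from the course assignments) shows that logistic regression on m samples is exactly the computation of one sigmoid output unit:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
n_x, m = 3, 5                      # 3 features, 5 samples (illustrative)
X = rng.standard_normal((n_x, m))  # samples as columns
w = rng.standard_normal((n_x, 1))  # one weight vector: a single "neuron"
b = 0.0

# Logistic regression is a single sigmoid unit applied to all samples:
A = sigmoid(w.T @ X + b)           # shape (1, m): P(y=1 | x) per sample
print(A.shape)
```

With no hidden layers in front of it, this is all a "network" does, which is why LR makes a good stepping stone to the full NN diagrams.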
Fantastic!
I like it going vertically. It fits better on a computer screen, and I just like it!
I suppose you meant that to be X? (if so, the same for the other diagram)
The way you slice W into neurons is very intuitive! And the same goes for slicing X into samples.
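That slicing can be checked numerically: row i of W holds neuron i's weights, column j of X holds sample j, and entry (i, j) of W @ X is neuron i applied to sample j. A tiny sketch with made-up shapes:

```python
import numpy as np

rng = np.random.default_rng(2)
W1 = rng.standard_normal((4, 3))  # layer 1: 4 neurons, 3 input features
X = rng.standard_normal((3, 5))   # 5 training samples as columns

Z1 = W1 @ X                       # shape (4, 5): neurons x samples

neuron_0 = W1[0, :]               # slicing W by rows gives one neuron
sample_0 = X[:, 0]                # slicing X by columns gives one sample

# Entry Z1[i, j] is neuron i's pre-activation on sample j:
assert np.isclose(Z1[0, 0], neuron_0 @ sample_0)
print(Z1.shape)
```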
Reading through the diagram, I can see how you understood it.

Hi Raymond, yes it is “X”, same as A^{[0]}, I will make this clear.
Also, a new dataflow diagram for backpropagation will drop soon…
The previous one was too complex and could not be mapped directly to the course. BAD!
But there is a real need for what Prof. Ng calls “the intuition of backpropagation”. Frankly, that would have been a great exercise in high school.
But first, I need to raise a couple of problems with the programming exercise of Week 3. 
Great! David, we will be here to discuss with you!
Cheers,
Raymond