The notes in week 1’s first programming assignment state that initializing the weights to zero prevents the neural net from breaking symmetry, so it effectively becomes a logistic regression. When the code is run with zero initialization, the loss does not decrease. This seems to imply that logistic regression’s loss will not decrease when the weights are initialized to zero.
However, in Course 1 we initialize the logistic regression weights to zero, and the loss does decrease.
Why is this?
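For what it’s worth, both observations can be reproduced in a few lines of NumPy. This is a minimal sketch with made-up toy data and hyperparameters (not the assignment’s actual model): a zero-initialized logistic regression gets a nonzero gradient on the first step and its loss falls, while a zero-initialized 2-layer tanh network never differentiates its hidden units, because the gradients flowing into the hidden weights are all zero.

```python
import numpy as np

# Toy, linearly separable data (an assumption for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # 200 examples, 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)      # labels depend on the features

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(p, y):
    eps = 1e-12                                # numerical safety for log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

lr = 0.1

# --- Logistic regression, weights initialized to zero ---
w, b = np.zeros(2), 0.0
loss_start = log_loss(sigmoid(X @ w + b), y)
for _ in range(200):
    p = sigmoid(X @ w + b)
    # dL/dw = X^T (p - y) / m is nonzero at w = 0 because p - y
    # still correlates with the inputs X, so learning proceeds.
    w -= lr * X.T @ (p - y) / len(y)
    b -= lr * np.mean(p - y)
loss_end = log_loss(sigmoid(X @ w + b), y)
print(loss_end < loss_start)                   # True: loss decreases from zero init

# --- 2-layer tanh network, ALL parameters initialized to zero ---
W1, b1 = np.zeros((2, 3)), np.zeros(3)         # 3 hidden units
W2, b2 = np.zeros((3, 1)), np.zeros(1)
for _ in range(200):
    A1 = np.tanh(X @ W1 + b1)                  # tanh(0) = 0 while W1, b1 stay 0
    p = sigmoid(A1 @ W2 + b2).ravel()
    dZ2 = (p - y)[:, None] / len(y)
    dW2 = A1.T @ dZ2                           # zero: A1 is all zeros
    dZ1 = (dZ2 @ W2.T) * (1 - A1 ** 2)         # zero: W2 is all zeros
    dW1 = X.T @ dZ1                            # zero: hidden weights never move
    W1 -= lr * dW1; b1 -= lr * dZ1.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * dZ2.sum(axis=0)
print(np.allclose(W1, 0))                      # True: hidden units stayed symmetric
```

In the network, only `b2` ever receives a gradient, so the model collapses to a constant predictor and the loss barely moves, matching the assignment’s observation. The logistic regression has no such symmetry to break: its single weight vector multiplies the inputs directly, so the gradient at zero is already informative.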