W1 order of steps in logistic regression

I didn’t notice this during the lecture, but this question on the quiz gave me pause. Doesn’t this order of operations perform one extra update on the weights, which could take the weights out of the local minimum? Or does the “get loss” step involve making an additional prediction using the updated weights?

I don’t understand your point: there is only one occurrence of “update” in that sequence. Also note that you’re not at a local minimum yet: reaching one is the goal, and this sequence describes how you “get there”.

But this is all sort of “angels dancing on the head of a pin” stuff. The algorithms are the algorithms. I think what they should say is:

Initialize parameters
while (not good enough)
   forward propagation (which includes both prediction and cost/loss calculation)
   backward propagation (which computes the gradients)
   update parameters
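
To make that concrete, here is a minimal sketch of the loop for logistic regression in NumPy. The function name, hyperparameters, and stopping rule (a fixed iteration count) are all illustrative, not from the course:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.1, num_iters=1000):
    """X: (m, n) feature matrix, y: (m,) labels in {0, 1}."""
    m, n = X.shape
    w = np.zeros(n)                      # initialize parameters
    b = 0.0
    for _ in range(num_iters):           # "while (not good enough)"
        # forward propagation: prediction AND cost in the same pass
        a = sigmoid(X @ w + b)
        cost = -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))
        # backward propagation: gradients of the cost
        dz = a - y
        dw = X.T @ dz / m
        db = np.mean(dz)
        # update parameters
        w -= lr * dw
        b -= lr * db
    return w, b, cost

Note that the forward pass computes both the prediction and the loss from the *current* weights, so no extra prediction with updated weights is needed; the update is simply the last step of each iteration.
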

In all four of the choices they give, how do you know what is covered by “repeat”? You clearly don’t want to repeat the initialization of the parameters, but the way they stated it is a bit ambiguous.