Week 2: Logistic Regression Vectorization steps

Just to understand what we did to vectorize Logistic Regression:

Step 1: Perform forward propagation to compute Z and then A.
Step 2: Perform backward propagation to compute da, dz, dw and db.
Step 3: Vectorize dz, dw and db -> dZ, dW and dB.
Step 4: Update w and b. (A rough sketch of these steps in code follows below.)
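
For concreteness, here is roughly what one iteration of those four steps looks like in vectorized NumPy. This is a minimal sketch, not the course's exact notebook code; the function name `iterate`, the assumed array shapes, and the `learning_rate` default are illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def iterate(w, b, X, Y, learning_rate=0.01):
    """One vectorized gradient-descent step for logistic regression.

    Assumed shapes (illustrative): X is (n_x, m), Y is (1, m),
    w is (n_x, 1), b is a scalar.
    """
    m = X.shape[1]

    # Step 1: forward propagation over all m examples at once
    Z = np.dot(w.T, X) + b    # shape (1, m)
    A = sigmoid(Z)            # shape (1, m)

    # Steps 2-3: vectorized backward propagation (no loop over examples)
    dZ = A - Y                # shape (1, m)
    dw = np.dot(X, dZ.T) / m  # shape (n_x, 1)
    db = np.sum(dZ) / m       # scalar

    # Step 4: gradient-descent update
    w = w - learning_rate * dw
    b = b - learning_rate * db
    return w, b
```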

Does each iteration perform steps 1 through 4 for each of the m examples? I get the general idea, but it's still a bit fuzzy to me.

I think you have the steps right, but the whole point of vectorization is that we perform each of those steps for all m samples at once, right? E.g. in Step 1, the formulas are:

Z = w^T \cdot X + b
A = \sigma(Z)

Both of those equations handle all m samples at the same time. It works the same way in backward propagation.
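
For completeness, here are the corresponding vectorized backward-propagation formulas, assuming the standard cross-entropy cost used for logistic regression:

dZ = A - Y
dw = \frac{1}{m} X \cdot dZ^T
db = \frac{1}{m} \sum_{i=1}^{m} dZ^{(i)}

Each of these is a single matrix operation over all m samples, so there is no explicit loop over the examples anywhere.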