Week 4: building blocks, cache for dw, db

Hi all,

My question is about DLS1 Week 4, the lecture called "Building Blocks of Deep Neural Networks".
In the slide below, I understand we use the cache to store the values of dW and db during backpropagation. Those values are then used to update the parameters W and b with the learning rate.
I am not sure when we are supposed to update W and b: in the next step of backpropagation, in the same step of backpropagation, or in the next iteration?

Thank you for clarifying.

Best regards,

Desiree

You update the parameters once in each iteration, right after backpropagation completes. So you don't need to save the dW^{[l]} and db^{[l]} values across iterations: they get used in "real time", within the same iteration in which they were computed.
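As a rough sketch of what that iteration looks like (the function and variable names here are illustrative, not necessarily the exact API used in the assignment):

```python
import numpy as np

def update_parameters(parameters, grads, learning_rate):
    """Gradient-descent step: W^[l] -= lr * dW^[l], b^[l] -= lr * db^[l]."""
    L = len(parameters) // 2  # number of layers (each has a W and a b)
    for l in range(1, L + 1):
        parameters["W" + str(l)] -= learning_rate * grads["dW" + str(l)]
        parameters["b" + str(l)] -= learning_rate * grads["db" + str(l)]
    return parameters

# One training iteration (hypothetical forward/backward helpers):
# for i in range(num_iterations):
#     AL, caches = forward_propagation(X, parameters)   # caches hold Z, A, W, b
#     grads = backward_propagation(AL, Y, caches)       # computes dW^[l], db^[l]
#     parameters = update_parameters(parameters, grads, learning_rate)  # same iteration
```

Note that the gradients are consumed immediately by the update step and then recomputed from scratch in the next iteration, which is why nothing needs to be saved between iterations.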

All this will become really clear when you get to the point of actually writing the code in the “Step by Step” assignment in Week 4.

Thank you Pau for replying to all of my questions!