W2_A1_Ex-9: problem finding the first argument for linear_activation_backward() when activation is 'relu'

[code removed - moderator]

I am confused about what exactly the first argument to linear_activation_backward() should be when computing dA_prev_temp, dW_temp, db_temp for the hidden layers (activation function is 'relu'). Since the for loop runs in the reversed direction, I guessed that for the lth layer it should be dA[l-1], but that gives me an error, so I kept dA[l] instead (so that l = L-1 in the first iteration, meaning it takes dA[L-1], computed from the last layer where the sigmoid activation is applied). Please correct me where I am wrong.

Use the post-activation gradient when going in the backward direction. For example, use dA at index 4 when performing the backward pass for the layer at index 3.

Keep in mind that the `range` class in Python generates values in [start_value, end_value).
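To make the indexing concrete, here is a minimal sketch of the bookkeeping described above. This is not the assignment's actual gradient code: `L`, the `"dA<i>"` key convention, and the `used` list are assumptions for illustration only, and the real `linear_activation_backward` call is replaced by simple record-keeping.

```python
L = 5  # assumed number of layers/caches for this illustration
used = []  # records (layer_index, dA_index_consumed) per relu backward step

# The sigmoid (output-layer) step has already produced dA at index L-1:
grads = {"dA" + str(L - 1): "gradient from the sigmoid layer"}

for l in reversed(range(L - 1)):  # l = 3, 2, 1, 0
    # The relu backward step for the layer at index l consumes the
    # post-activation gradient at index l + 1 (one layer closer to the output):
    dA_in = grads["dA" + str(l + 1)]
    used.append((l, l + 1))
    # ...and produces dA at index l, which the next iteration will consume:
    grads["dA" + str(l)] = "gradient produced at layer index " + str(l)

print(used)  # [(3, 4), (2, 3), (1, 2), (0, 1)]
```

Running this shows the pattern from the reply above: the layer at index 3 consumes dA at index 4, index 2 consumes index 3, and so on down to index 0.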

@balaji.ambresh As far as I understand, in backward propagation for the nth layer we need the (n+1)th A. I want to know which layer this (n+1)th A belongs to.

Please read the markdown and code comments. Your question is answered in the notebook.