# W4_A1_Ex-9_L_model_backward

Hello! I hope you are doing well.

I read all the posts on this topic (Exercise 9 - L_model_backward) but didn't find my answer.
A hint for solving this exercise is given in the attached file, but I am not able to understand it.

1. First, what is `current_cache`?

2. I used the `linear_activation_backward` function to compute `dA_prev_temp`, `dW_temp`, and `db_temp`.

3. Since `grads["dW" + str(l)]` should hold $dW^{[l]}$, for sigmoid I used:

```
Code removed
```

and for relu, I used:

```
Code removed
```

Where am I going wrong? Kindly guide me; I would be very thankful.

PS: After solving this, I will remove my code.

Regards,
Saif Ur Rehman.

First, let's review the flow of this function: in `L_model_backward` you go from layer L down to layer 1, computing the backprop values for each layer. So with L layers, index L-1 is your starting point, array-wise.

Second, let's keep in mind the docstring: "caches – list of caches containing: …" So we have a list of all the saved caches here.

For your #1: What is `current_cache`? It is the entry in `caches` for the layer you are currently processing: you start at index L-1 (the cache of the last layer) and work in reverse down to index 0.
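The indexing above can be sketched without giving away any assignment code. This is a minimal, hypothetical illustration (the cache entries are placeholder strings, not the real `(linear_cache, activation_cache)` tuples) showing how the array index `l` relates to the layer number:

```python
# Hypothetical sketch: each entry stands in for one layer's saved cache.
caches = ["cache_layer_1", "cache_layer_2", "cache_layer_3"]  # placeholders
L = len(caches)  # number of layers

# Layer L uses caches[L - 1], layer L - 1 uses caches[L - 2], ..., layer 1 uses caches[0].
visited = []
for l in reversed(range(L)):       # l = L-1, L-2, ..., 0 (array index)
    current_cache = caches[l]      # cache for layer l + 1
    visited.append((l + 1, current_cache))
```

Here `visited` records `(layer number, cache)` pairs in the order they are processed, i.e. layer 3 first and layer 1 last.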

For your #2: You used `linear_activation_backward` to compute some values that, by the way, I don't see you using afterwards. Make sure you call `linear_activation_backward` with the right dA and the right activation; the instructions give clear hints about this.

For your #3: Here's where you should review the values you use to update `grads`: as mentioned earlier, I don't see you using the values returned by `linear_activation_backward`.
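The bookkeeping pattern being hinted at can be sketched abstractly without revealing the assignment's implementation. In this hypothetical sketch, `linear_activation_backward` is a stub that just labels its outputs, and the cache and dA arguments are placeholder strings; the point is only that the *returned* `dA_prev_temp`, `dW_temp`, `db_temp` are what get stored in `grads`, under the layer index that matches:

```python
# Stub standing in for the real function, which returns (dA_prev, dW, db).
def linear_activation_backward(dA, cache, activation):
    return f"dA_prev({activation})", f"dW({activation})", f"db({activation})"

grads = {}
L = 2  # two-layer example: one relu hidden layer, sigmoid output layer

# Output layer (sigmoid): uses dAL and the last cache.
dA_prev_temp, dW_temp, db_temp = linear_activation_backward("dAL", "cacheL", "sigmoid")
grads["dA" + str(L - 1)] = dA_prev_temp
grads["dW" + str(L)] = dW_temp   # store the returned value, not a recomputed one
grads["db" + str(L)] = db_temp

# Hidden layers (relu): each uses the dA just stored for the layer above it.
for l in reversed(range(L - 1)):
    dA_prev_temp, dW_temp, db_temp = linear_activation_backward(
        grads["dA" + str(l + 1)], "cache" + str(l), "relu")
    grads["dA" + str(l)] = dA_prev_temp
    grads["dW" + str(l + 1)] = dW_temp
    grads["db" + str(l + 1)] = db_temp
```

After running this, `grads` holds `dW2`/`db2` from the sigmoid step and `dW1`/`db1`/`dA0` from the relu step, which is the shape of result the grader expects.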

Hope these hints help.

Juan

PS: IMPORTANT: Please remove your code as it goes against the Honor Code.


Thank you so much, sir @Juan_Olano. I understood it and passed the assignment.

Yes, I removed it.