W4 - Programming assignment 1

Hello

In the linear_activation_backward function, I have to compute the dA_prev, dW, and db variables. Computing them requires the W and A variables, which aren't available. What should I do?

Best,
Mansoor

Hi Mansoor,

Are you referring to Exercise 8? If so, check out the hints before the problem (section 6.2, Linear-Activation Backward); you will see some pre-implemented functions that are already imported into the assignment there. Hope it helps!

Best,
Kezhen

That is the purpose of the cache variable that is passed in. Have a closer look at how those are constructed and what they contain.

Thanks Kchong for your response,

The pre-implemented functions need a cache variable that is different from the cache variable passed into this function.

Thank you Paulinpaloalto for your response,

The cache variable for this function contains the linear_cache and activation_cache variables, although it should contain the A_prev, W, and b variables.

Well, that’s the first level of analysis. Now you need to look at what is contained in linear_cache and activation_cache.

In linear_cache there is the Z variable, and in activation_cache there is the A variable.

You are not interpreting the code correctly. The activation cache contains Z and the linear cache is a 3-tuple that contains (A, W, b). Look at where those cache values are generated: by linear_forward and by sigmoid and relu. Then linear_activation_forward concatenates them to make the full cache entry for each layer. The code is all there for you to see. You can find the code for sigmoid and relu by clicking “File → Open” and then opening the utility python file.
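To make the cache plumbing concrete, here is a minimal sketch of how the forward pass packs the two caches and how the backward pass unpacks them to recover A_prev and W. This is not the course's exact starter code (the real assignment passes the activation name as a parameter and uses its own helper signatures); it is a simplified ReLU-only illustration of the structure described above.

```python
import numpy as np

def linear_forward(A_prev, W, b):
    Z = W @ A_prev + b
    linear_cache = (A_prev, W, b)       # 3-tuple saved for the backward pass
    return Z, linear_cache

def relu(Z):
    activation_cache = Z                # Z is saved to compute the ReLU gradient
    return np.maximum(0, Z), activation_cache

def linear_activation_forward(A_prev, W, b):
    Z, linear_cache = linear_forward(A_prev, W, b)
    A, activation_cache = relu(Z)
    cache = (linear_cache, activation_cache)  # the per-layer cache entry
    return A, cache

def linear_activation_backward(dA, cache):
    linear_cache, activation_cache = cache    # first level of unpacking
    A_prev, W, b = linear_cache               # here are the A and W you need
    Z = activation_cache
    dZ = dA * (Z > 0)                         # ReLU derivative
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db
```

So the variables are available; they are just nested one level deeper inside the cache tuple than a first glance suggests.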