Hi all, could you please help here?
{moderator edit - solution code removed}
As you know, L_model_backward() starts by calculating dAL. Then, as the first step, dAL is back-propagated through linear_activation_backward().
What you are working on is a loop that walks back through the remaining layers after that first step. By that point, dAL has already been transformed into one of the key variables for those iterations.
Please revisit the parameters you are passing to linear_activation_backward().
Hope this helps.
One thing… Posting your code here is not recommended. Please remove it, thank you.
I went ahead and edited the post to remove the code.
To state Nobu’s point in a slightly different way: notice that for every iteration of the loop over the hidden layers, you are passing the same value as the dA argument to linear_activation_backward. That basically misses the point of how back propagation works, right?
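To make the chaining concrete without reposting assignment code, here is a generic sketch of the pattern (the helper name and its (dA, cache, activation) signature follow the notebook; the loop bounds and grads keys are just illustrative):

```python
import numpy as np

# A generic sketch of the chaining idea, not the graded solution.
def backward_pass_sketch(AL, Y, caches, linear_activation_backward):
    grads = {}
    L = len(caches)  # total number of layers

    # First compute dAL, the derivative of the cross-entropy cost
    # with respect to the final activation AL.
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # The output (sigmoid) layer consumes dAL ...
    dA_prev, dW, db = linear_activation_backward(dAL, caches[L - 1], "sigmoid")
    grads["dW" + str(L)], grads["db" + str(L)] = dW, db

    # ... but each hidden (relu) layer must consume the dA_prev produced
    # by the layer after it. Passing dAL again on every iteration is
    # exactly the bug described above.
    for l in reversed(range(L - 1)):
        dA_prev, dW, db = linear_activation_backward(dA_prev, caches[l], "relu")
        grads["dW" + str(l + 1)], grads["db" + str(l + 1)] = dW, db

    return grads
```

The point is in the loop body: the dA_prev returned by each call becomes the dA input to the next one. That hand-off is the chain rule in action.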
Thanks for removing the code. I got the answer and it worked. Thanks, Nobu and paulinpaloalto!
Hi again, I’m working on Course 1: Week 4: Assignment 2 – implementing two_layer_model. I’m getting the error below.
Could you kindly help with this?
Are you sure that you followed the instructions to the letter? E.g., that you did not use the “deep” version of the init routine in the two-layer case?
Thanks paulinpaloalto. It worked.
There is no reason it would be logically incorrect to use the more flexible “deep” init routine; it just turns out they used different logic in that routine. The result is that you get a different answer and it fails the grader. You can examine the two functions by clicking “File → Open”. You’ll notice that the “deep” routine uses a more sophisticated initialization method that we will learn about in Course 2. They didn’t bother to explain that, because it’s a more advanced topic and we’re just trying to get the foundations built here in Course 1.
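For anyone curious about what actually differs between the two routines, it comes down to how the random weights are scaled. Here is a minimal sketch from memory (treat the exact factors as assumptions and check the helper file in your own notebook via “File → Open”):

```python
import numpy as np

# Both schemes from memory; verify against the helper file in your notebook.

def init_basic(n, n_prev):
    # The scheme the two-layer model expects: a fixed small factor.
    return np.random.randn(n, n_prev) * 0.01

def init_deep(n, n_prev):
    # The "deep" routine's Xavier-style scaling by 1/sqrt(fan-in),
    # the more sophisticated method covered in Course 2.
    return np.random.randn(n, n_prev) / np.sqrt(n_prev)

# Same shapes, very different magnitudes, so every downstream value
# (and therefore the grader comparison) ends up different.
np.random.seed(1)
print(np.abs(init_basic(3, 4)).mean())  # on the order of 0.01
np.random.seed(1)
print(np.abs(init_deep(3, 4)).mean())   # on the order of 0.5
```

So neither scheme is wrong in principle; they just produce different numbers, and the expected values for two_layer_model were evidently generated with the basic one.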