W4_A1_Ex-5_Test Fails for L_model_forward

Hi,

I got the following error while running Exercise 5, but it is worth noting that no such error arose while running the earlier section where we compute the dot product between A and W.
P.S.: The initial A inside the linear_activation_forward function has been corrected to A_prev, but I still get the same error.

It looks like the mistake is that you are hard-coding the logic to always use W1 and b1 when you call linear_activation_forward. That only works for a network with one hidden layer. We are trying to write general code here that can handle any number of layers, and the specific test case here has 2 hidden layers. You can see how to generalize this by looking at the logic in initialize_parameters_deep.
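
For intuition, here is a minimal, self-contained sketch of that indexing pattern. This is not the graded solution: the toy linear_activation_forward and the layer sizes below are stand-ins for the assignment's versions, included only so the snippet runs on its own.

```python
import numpy as np

# Hypothetical stand-in for the assignment's linear_activation_forward,
# included only so the indexing pattern below runs standalone.
def linear_activation_forward(A_prev, W, b, activation):
    Z = W @ A_prev + b
    A = np.maximum(0, Z) if activation == "relu" else 1 / (1 + np.exp(-Z))
    return A, (A_prev, W, b, Z)

# Toy parameters for a 3-layer network (sizes 4 -> 5 -> 3 -> 1),
# built the same way initialize_parameters_deep builds its keys.
rng = np.random.default_rng(0)
layer_dims = [4, 5, 3, 1]
parameters = {}
for l in range(1, len(layer_dims)):
    parameters["W" + str(l)] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
    parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))

# The key point: build the keys "W" + str(l) and "b" + str(l) from the
# loop index instead of hard-coding "W1" and "b1".
X = rng.standard_normal((4, 7))   # 4 input features, 7 examples
A = X
L = len(parameters) // 2          # each layer contributes one W and one b
caches = []
for l in range(1, L):             # hidden layers use ReLU
    A, cache = linear_activation_forward(
        A, parameters["W" + str(l)], parameters["b" + str(l)], "relu")
    caches.append(cache)
# Output layer uses sigmoid and the A produced by the loop above.
AL, cache = linear_activation_forward(
    A, parameters["W" + str(L)], parameters["b" + str(L)], "sigmoid")
caches.append(cache)
print(AL.shape)  # (1, 7): one output unit, 7 examples
```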

It is a general principle of debugging that just because the error is thrown in a particular function, that does not mean that is where the bug actually is. It may be that (as in this case) you passed incorrect arguments to a correct function. So you need to track back up the call stack to see where the mistake actually is. But I think I’ve given you a pretty big clue in the previous paragraph.

Thank you, Paul. As you said, I was hard-coding W1 and b1.

Hello! I ran into a similar error to @user492, but in my case it was an assertion error, and I don't think I hard-coded W1 and b1 when calling the function; instead, I generalized across the layers. Any ideas on how I might fix this?

Your outputs all have the wrong shape. Can you show us the full output you get, not just the errors?

Have you been through the dimensional analysis described in this thread? You might add print statements to show the shape of A at each layer. Also note that the test case checks the cache outputs as well, so those could be wrong even if the A values are correct. You can examine the test case by clicking “File → Open” and reading the file public_tests.py to see everything it checks.
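
If it helps, here is a quick sanity-check sketch for the dimensional analysis. The layer sizes and batch size are hypothetical; the point is that you can write out the shapes you expect at each layer and compare them with what your print statements show.

```python
# Dimensional analysis sketch: for layer_dims = [n_x, n_1, ..., n_L] and a
# batch of m examples, the activation A at layer l should have shape
# (layer_dims[l], m). These sizes are hypothetical examples.
layer_dims = [5, 4, 3, 1]
m = 10
for l, n in enumerate(layer_dims):
    print(f"expected shape of A{l}: ({n}, {m})")
```

A print statement like print(A.shape) inside your forward loop should match these line by line; the first layer at which they diverge tells you where to look.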

Note that there are quite a few “moving parts” here, so there are lots of possible ways to get this wrong. Nobody said that debugging is easy, but it’s part of the job, right?

Good day. I haven’t been active recently, but I am now, and here is the full screenshot of the output I got.
I will do just what you said.
Yes, debugging is part of the job.

The other thing to pay attention to is that the test case is checking more than just the A^{[l]} values: it also checks the caches. You can see what the test case is doing by clicking “File → Open” and then looking at the file public_tests.py.

Thank you so much @paulinpaloalto. It turns out that all my code was right except that I used “A_prev” instead of “A” when calling the “linear_activation_forward” function for sigmoid. Dimensional analysis was the key to making this work. Thank you once again for your constant help and assistance. I am so happy :innocent: :blush:

It’s great news that you were able to find the issue by using dimensional analysis. Thanks for confirming! Onward! :nerd_face:

Thank you once again @paulinpaloalto for your constant help and guidance. Finally finished the first specialization today.
Best regards!