# Week 4, Programming Assignment 1 of 2, Exercise 5: L-model forward

Hello all,

I am having an issue with my forward propagation for this assignment:

Context

• In the for-loop I am using ReLU to calculate the activation.
• I am using the information in `parameters` to get the values of `W` and `b`.
• I retrieve the values using the correct indexes (they seem to work).
• Outside of the for-loop, I am invoking the sigmoid function.
• Same as in the loop, I retrieve `W` and `b` from `parameters`; `A` comes from the for-loop.
• As there is not much that could be tweaked, I am at a loss.

Problem

• An assertion fails in a helper function due to wrong dimensions (see below)
• However, that other function did pass all assertions in "Exercise 3 - linear_forward". But now it is precisely there that the grader is stopping, so I have been stuck for a while.
```
<ipython-input-165-c05f92c03a0d> in linear_forward(A, W, b)
19     # YOUR CODE STARTS HERE
20
---> 21     Z = np.dot(W,A)+b
22
23     # YOUR CODE ENDS HERE

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (3,4) and (3,4) not aligned: 4 (dim 1) != 3 (dim 0)
```
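For reference, this error reproduces in plain NumPy whenever the inner dimensions of `np.dot` disagree. `np.dot(W, A)` requires `W.shape[1] == A.shape[0]`; a minimal sketch (shapes taken from the traceback, the rest is illustrative):

```python
import numpy as np

W = np.random.randn(3, 4)      # shape (3, 4)
A_bad = np.random.randn(3, 4)  # shape (3, 4): inner dims 4 != 3, so dot fails

try:
    Z = np.dot(W, A_bad)
except ValueError as e:
    print(e)  # shapes (3,4) and (3,4) not aligned: 4 (dim 1) != 3 (dim 0)

# A compatible activation matrix must have 4 rows (one per input unit):
A_ok = np.random.randn(4, 5)       # 5 examples
b = np.zeros((3, 1))               # bias broadcasts across the columns
Z = np.dot(W, A_ok) + b
print(Z.shape)  # (3, 5)
```

So the `linear_forward` code itself is fine; the question is why an `A` with the wrong number of rows reached it.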

I found a slightly similar post but I decided to create a separate one as the other one referred to a different problem in the same area of the code. I hope this is fine.

Thank you for any pointers.
Jonathan

Hi @Jonathanlugo. It's often the case that when learners invoke their previously implemented "helper functions", they use "hard-coded" function arguments, usually the result of a cut-and-paste error. Instead, one must apply the arguments defined in the "signature" of the function that they are working on. For example, `L_model_forward(X, parameters)` has `X` and `parameters` as its arguments.

As it seems you are well aware, that function requires a call to a helper function to complete the `A, cache = ...` and `AL, cache = ...` lines. That helper function in turn calls others that you implemented previously. From your traceback, you can see where the source of the problem lies.

So, first check that all previous functions passed their tests with the expected outputs. Second, make sure that there are no hard-coding errors along the way. If you can clear both of these hurdles, then the problem lies in the implementation of `L_model_forward`.
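To illustrate the hard-coding failure mode (the names here are hypothetical, not the course code): pasting a fixed variable such as `X` where the running activation belongs means every layer after the first receives an input with the wrong number of rows, and the dot product fails exactly as in the traceback above.

```python
import numpy as np

def forward_buggy(X, weights):
    """Chain of linear layers with a cut-and-paste bug."""
    A = X
    for W in weights:
        A = np.dot(W, X)  # BUG: 'X' was pasted in; should be the running 'A'
    return A

def forward_fixed(X, weights):
    """Each layer consumes the previous layer's activation."""
    A = X
    for W in weights:
        A = np.dot(W, A)
    return A

X = np.random.randn(4, 7)                              # 4 features, 7 examples
weights = [np.random.randn(3, 4), np.random.randn(2, 3)]

print(forward_fixed(X, weights).shape)  # (2, 7)
try:
    forward_buggy(X, weights)           # second layer: (2,3) dot (4,7) fails
except ValueError as e:
    print("buggy version fails:", e)
```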

2 Likes

Just to give a higher-level summary of the excellent advice from @kenb: it is a general principle of debugging that just because the error is thrown in a particular function, that does not mean that is where the actual bug is. Your `linear_forward` routine is probably correct, since it passed the previous test cases. So the problem is that you passed incorrect (mismatching) values for `A` and `W` down to that routine. So where is the call site? How could things go wrong? Note that it always helps in a case like this to first do the "dimensional analysis", so that you know what should be happening at each layer. Here's a thread which covers that for the "2hidden" test case, if that's the one you are talking about (you don't really say).
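The dimensional analysis can itself be done in a few lines of throwaway NumPy. With hypothetical layer sizes (these are illustrative, not the assignment's actual test dimensions), each `W[l]` must be `(n[l], n[l-1])` so that it can consume the previous activation:

```python
import numpy as np

layer_dims = [5, 4, 3, 1]  # hypothetical: 5 inputs, hidden 4 and 3, output 1
m = 7                      # assumed number of examples

A = np.random.randn(layer_dims[0], m)  # A0 = X, shape (5, 7)
for l in range(1, len(layer_dims)):
    W = np.random.randn(layer_dims[l], layer_dims[l - 1])
    b = np.zeros((layer_dims[l], 1))
    Z = np.dot(W, A) + b
    print(f"layer {l}: W {W.shape} dot A {A.shape} -> Z {Z.shape}")
    A = np.maximum(0, Z)  # ReLU stand-in; the final layer would use sigmoid
```

Printing the shapes layer by layer like this, and comparing against your loop's indexing into `parameters`, usually pinpoints exactly which layer hands the wrong `A` or `W` to `linear_forward`.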

1 Like

Thank you very much @kenb and @paulinpaloalto, your pointers, and particularly the "dimensional analysis", helped me fix the issue. Not only did I understand this section and its concepts better, but I also now know how to debug my own code more efficiently.

Thank you both!
Jonathan

1 Like