Exercise 5 - L_model_forward

Hi all,
I have been checking the dimensions manually and they seem right to me, yet I am getting the error below. Also, how can I check each vector's shape? I mean, where would I put the code to do that?

ValueError                                Traceback (most recent call last)
<ipython-input-159-10fc901e800a> in <module>
      1 t_X, t_parameters = L_model_forward_test_case_2hidden()
----> 2 t_AL, t_caches = L_model_forward(t_X, t_parameters)
      3 
      4 print("AL = " + str(t_AL))
      5 

<ipython-input-158-cfe68e945af9> in L_model_forward(X, parameters)
     28         # YOUR CODE STARTS HERE
     29 
---> 30         A, cache = linear_activation_forward(X, parameters["W"+str(l)], parameters["b"+str(l)], activation = "relu")
     31 
     32         caches.append(cache)

<ipython-input-125-1980cedde44f> in linear_activation_forward(A_prev, W, b, activation)
     35         # YOUR CODE STARTS HERE
     36 
---> 37         Z, linear_cache = linear_forward(A_prev, W, b)
     38 
     39         A, activation_cache = relu(Z)

<ipython-input-123-0e4ecaaa6191> in linear_forward(A, W, b)
     19     # YOUR CODE STARTS HERE
     20 
---> 21     Z= np.dot(W, A)+ b
     22 
     23 

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (3,4) and (5,4) not aligned: 4 (dim 1) != 5 (dim 0)

Here is a thread which gives the full “dimensional analysis” for this test case. Have a careful look at that; it should give you some direction for where to look for your error.

For example, comparing to the dimensions stated on that thread, it looks like the W value in your case is W2, which is 3 x 4, but the dimension of the A1 input is wrong: it is the same as the dimension of X (5 x 4), when it should be 4 x 4. So how could that happen?
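
To make that concrete, here is a minimal numpy sketch with made-up weight values that reproduces exactly this error. The layer sizes are inferred from your traceback, not from your actual code: X is (5, 4) and W2 is (3, 4), which forces layer 1 to have 4 units and W1 to be (4, 5).

```python
import numpy as np

np.random.seed(1)

X = np.random.randn(5, 4)                   # 5 features, 4 examples
W1, b1 = np.random.randn(4, 5), np.zeros((4, 1))
W2, b2 = np.random.randn(3, 4), np.zeros((3, 1))

A1 = np.maximum(0, np.dot(W1, X) + b1)      # relu(Z1) -> shape (4, 4)

Z2 = np.dot(W2, A1) + b2                    # (3,4) . (4,4) -> (3,4): aligned
print(Z2.shape)                             # (3, 4)

try:
    Z2_bad = np.dot(W2, X) + b2             # (3,4) . (5,4): the bug in this thread
except ValueError as err:
    print(err)  # shapes (3,4) and (5,4) not aligned: 4 (dim 1) != 5 (dim 0)
```

So the error fires precisely when the second layer is handed X instead of A1.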

Hi @paulinpaloalto
I don't know how that happened; I passed all the previous exercises as well.
How can I check the dimensions at each step? I am using W.shape, but nothing is shown.
Here is my code for this exercise:

{moderator edit - solution code removed}

You did actually read the code, right? You are always passing X as the first argument to linear_activation_forward. That is wrong and is what causes this error.

Note that once you fix that, you’ll then get a shape mismatch on the output layer, because that logic is also wrong. What is the value of A_prev when you fall out of the “for” loop?

In terms of how to do dimensional analysis, it's not just the W shapes, right? It's the shapes of the A values at each layer that are the real point. Put a print statement in your loop to show the loop index and the shape of A, and another one after you fall out of the loop. (Note that W.shape on a line by itself doesn't display anything unless it is the last expression in a notebook cell; use print(W.shape).)
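
Here is a sketch of what that tracing looks like. This is a toy version, not the assignment solution: the layer sizes are just my reading of the test case from the traceback, and I use relu everywhere since only the shapes matter here.

```python
import numpy as np

np.random.seed(0)
layer_sizes = [5, 4, 3, 1]                  # n_x, n_h1, n_h2, n_y (assumed)
A = np.random.randn(layer_sizes[0], 4)      # A0 = X, with 4 examples
print("layer 0 (input): A.shape =", A.shape)

for l in range(1, len(layer_sizes)):
    W = np.random.randn(layer_sizes[l], layer_sizes[l - 1])
    b = np.zeros((layer_sizes[l], 1))
    A = np.maximum(0, np.dot(W, A) + b)     # forward step for layer l
    print("layer", l, ": A.shape =", A.shape)

# The printed shapes should match the dimensional analysis:
# (5, 4) -> (4, 4) -> (3, 4) -> (1, 4)
```

If your own prints show A keeping the shape of X at every layer, that tells you the loop is not feeding each layer's output forward to the next one.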