Week 4 Exercise 5 L_model_forward

I am confused about what to put in the W and b parameters for the last layer, which uses sigmoid.

I am getting the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
in <module>
      1 t_X, t_parameters = L_model_forward_test_case_2hidden()
----> 2 t_AL, t_caches = L_model_forward(t_X, t_parameters)
      3 
      4 print("AL = " + str(t_AL))
      5 

in L_model_forward(X, parameters)
     36     # caches ...
     37     # YOUR CODE STARTS HERE
---> 38     AL, cache = linear_activation_forward(A_prev, parameters['W' + str(L)], parameters['b' + str(L)], activation="sigmoid")
     39     caches.append(cache)
     40     # YOUR CODE ENDS HERE

in linear_activation_forward(A_prev, W, b, activation)
     22     # A, activation_cache = ...
     23     # YOUR CODE STARTS HERE
---> 24     Z, linear_cache = linear_forward(A_prev, W, b)
     25     A, activation_cache = sigmoid(Z)
     26     # YOUR CODE ENDS HERE

in linear_forward(A, W, b)
     18     # Z = ...
     19     # YOUR CODE STARTS HERE
---> 20     Z = np.dot(W, A) + b
     21 
     22     # YOUR CODE ENDS HERE

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (1,3) and (4,4) not aligned: 3 (dim 1) != 4 (dim 0)


To find the bug, you need to perform dimensional analysis. @paulinpaloalto has already written a great post covering L_model_forward_test_case_2hidden():
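To illustrate the dimensional analysis, here is a minimal sketch. The layer shapes below are assumed from the traceback (W3 is (1,3) and the mismatched activation is (4,4)), not copied from the notebook; the point is that the sigmoid layer must consume the output of the *last* hidden layer, and reusing a stale A_prev reproduces exactly the "not aligned" error:

```python
import numpy as np

# Assumed illustrative shapes: X (5,4), W1 (4,5), W2 (3,4), W3 (1,3).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))
W1 = rng.standard_normal((4, 5)); b1 = np.zeros((4, 1))
W2 = rng.standard_normal((3, 4)); b2 = np.zeros((3, 1))
W3 = rng.standard_normal((1, 3)); b3 = np.zeros((1, 1))

A1 = np.maximum(0, W1 @ X + b1)   # ReLU layer 1 -> shape (4, 4)
A2 = np.maximum(0, W2 @ A1 + b2)  # ReLU layer 2 -> shape (3, 4)

# Correct: the sigmoid layer consumes A2, so (1,3) @ (3,4) -> (1,4).
Z3 = W3 @ A2 + b3

# Buggy: passing the stale A1 as A_prev reproduces the traceback's error,
# because (1,3) @ (4,4) is not a valid matrix product.
try:
    W3 @ A1
except ValueError as e:
    print(e)  # shapes (1,3) and (4,4) not aligned
```

So if you see this error, check that the A_prev you hand to the sigmoid layer is the activation produced by the final loop iteration, not an earlier one.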


Just to save others' time: try this link and you will understand how the `for i in range(1, L)` loop works. It took me more than an hour just because of this number :slight_smile: Python Tryit Editor v1.0
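To make the range behavior concrete, here is a quick check (assuming L = 3, i.e. two hidden layers plus the output layer, as in the test case). Note that `range(1, L)` stops *before* L, which is why the output layer gets its own sigmoid step after the loop:

```python
L = 3  # total layers: 2 hidden (ReLU) + 1 output (sigmoid)

# range(1, L) yields 1 and 2 only -- the hidden layers.
hidden_layer_indices = list(range(1, L))
print(hidden_layer_indices)  # [1, 2]

# Layer L (here, 3) is handled separately after the loop, so
# parameters["W3"] / parameters["b3"] are used exactly once, with sigmoid.
```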

Dear Community,
I am having a caching problem in Exercise 5 - L_model_forward. Can I get some help?

---------------------------------------------------------------------------
UnboundLocalError                         Traceback (most recent call last)
<ipython-input-33-10fc901e800a> in <module>
      1 t_X, t_parameters = L_model_forward_test_case_2hidden()
----> 2 t_AL, t_caches = L_model_forward(t_X, t_parameters)
      3 
      4 print("AL = " + str(t_AL))
      5 

<ipython-input-32-4f52d0652ddc> in L_model_forward(X, parameters)
     27         # caches ...
     28         # YOUR CODE STARTS HERE
---> 29         A, cache = linear_activation_forward(A_prev, parameters["W" + str(l)], parameters["b" + str(l)], relu)
     30         caches.append(cache)
     31 

<ipython-input-14-86db2fd9a9de> in linear_activation_forward(A_prev, W, b, activation)
     36 
     37         # YOUR CODE ENDS HERE
---> 38     cache = (linear_cache, activation_cache)
     39 
     40     return A, cache

UnboundLocalError: local variable 'linear_cache' referenced before assignment

The notebook says: "Don't forget to keep track of the caches in the 'caches' list. To add a new value c (cache) to a list (caches), you can use list.append(c)."
What can be the error here?

You can figure this out by looking at the logic in linear_activation_forward. If you get to that line and linear_cache has not been given a value, it means you did not take either of the “if” statements, right? So how could that happen? Notice that they are checking for the string name of the activation function. But that is not what you passed: you passed an object reference to the actual function instead. That is why neither of the “if” conditions matched.
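A minimal sketch of that dispatch logic (this is a simplified reconstruction for illustration, not the graded code; the helper name is hypothetical):

```python
def relu(Z):
    # Stand-in activation: returns the activation and its cache.
    return Z * (Z > 0), Z

def linear_activation_forward_sketch(activation):
    # The notebook compares against the STRING name of the activation.
    # Passing the function object `relu` matches neither branch, so
    # linear_cache is never assigned and the return line raises
    # UnboundLocalError.
    if activation == "sigmoid":
        linear_cache = "took sigmoid branch"
    elif activation == "relu":
        linear_cache = "took relu branch"
    return linear_cache

print(linear_activation_forward_sketch("relu"))  # took relu branch

try:
    linear_activation_forward_sketch(relu)  # function object, not a string
except UnboundLocalError as e:
    print(e)  # local variable 'linear_cache' referenced before assignment
```

The fix is simply to pass `"relu"` or `"sigmoid"` as a string, matching the signature the notebook defines.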


Thank you for your reply; indeed, the string vs. object reference distinction solved that problem.
Now I have a dimensional issue, but I am 90% confident I will be able to handle that one.

For the forward propagation, here’s a thread which walks through the “dimensional analysis” on the “2hidden” test case. That’s worth a look as a framework for debugging dimension issues.

That is a very helpful thread, thank you!
In a very strange way I got the idea of matrix dimension behaviour really quickly :man_shrugging:
I wrote down on a piece of paper the layers, the variables, and their dimensions, then aligned them together in the correct order until it worked.
My special power for 'messing codes up' is my mistyping,
e.g. np.random.rand() instead of np.random.randn(), "b" instead of "w", and then staring at the code for hours.
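That rand/randn mix-up is worth spelling out, since the two look almost identical but sample from different distributions. A quick sketch of the difference:

```python
import numpy as np

np.random.seed(1)
u = np.random.rand(3, 2)   # uniform on [0, 1): every value is non-negative

np.random.seed(1)
g = np.random.randn(3, 2)  # standard normal: roughly half the values negative

print(u.min() >= 0)        # True -- rand never produces negatives
print((g < 0).any())       # True here -- randn does, which matters for
                           # weight initialization and ReLU behavior
```

With rand, all initial weights are positive, which can make symmetry-breaking and the expected test outputs differ from the graded solution even though the code "runs".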
