W4_A1_L_model_forward_Test_case

Hello everyone,
I am running L_model_forward and I got this error message:
UnboundLocalError                         Traceback (most recent call last)
<ipython-input-…> in <module>
      1 t_X, t_parameters = L_model_forward_test_case_2hidden()
----> 2 t_AL, t_caches = L_model_forward(t_X, t_parameters)
      3
      4 print("AL = " + str(t_AL))
      5

<ipython-input-…> in L_model_forward(X, parameters)
     27         # caches …
     28         # YOUR CODE STARTS HERE
---> 29         A, cache = linear_activation_forward(A_prev, parameters["W" + str(l)], parameters["b" + str(l)], relu)
     30         caches.append(cache)
     31         # YOUR CODE ENDS HERE

<ipython-input-…> in linear_activation_forward(A_prev, W, b, activation)
     34         A, activation_cache = relu(Z)
     35     # YOUR CODE ENDS HERE
---> 36     cache = (linear_cache, activation_cache)
     37
     38     return A, cache

UnboundLocalError: local variable 'linear_cache' referenced before assignment

Need your help!!!

What this error means is that the variable linear_cache inside cache = (linear_cache, activation_cache) is being used before it is defined. Go back to that function and see if that is the case; if so, you need to assign linear_cache before you can use it in that tuple.
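For anyone unfamiliar with this error class, here is a minimal sketch of how it arises (the function f and the flag parameter are purely illustrative, not from the assignment): a local variable that is only assigned inside a branch that never runs raises UnboundLocalError at the point of use.

def f(flag):
    if flag:
        x = 1    # x is only assigned when this branch runs
    return x     # fine for f(True); raises UnboundLocalError for f(False)

f(True)     # returns 1
f(False)    # UnboundLocalError: local variable 'x' referenced before assignment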

I don't really understand. linear_cache is actually a return value of a function that I call inside this function, and cache = (linear_cache, activation_cache) is just a way to store that return value to build the return of my own function.
I don't really know how else I should define linear_cache :thinking:
What I did in my actual function is something like this:

def actual_function(x1, x2, ..., xn):

    linear_cache = past_function1(x1, x2, ..., xn)
    activation_cache = past_function2(x1, x2, ..., xn)

    cache = (linear_cache, activation_cache)

    return cache

Maybe try defining linear_cache outside the function.

Ok I will restart the kernel and rerun the cells

Okay, I will try defining it outside the function.

Hi,
I am still having the same problem in assignment 2.
I’m running the function

parameters, costs = two_layer_model(train_x, train_y, layers_dims = (n_x, n_h, n_y), num_iterations = 2, print_cost=False)

and I get this error:

UnboundLocalError                         Traceback (most recent call last)
<ipython-input-…> in <module>
----> 1 parameters, costs = two_layer_model(train_x, train_y, layers_dims = (n_x, n_h, n_y), num_iterations = 2, print_cost=False)
      2
      3 print("Cost after first iteration: " + str(costs[0]))
      4
      5 two_layer_model_test(two_layer_model)

<ipython-input-…> in two_layer_model(X, Y, layers_dims, learning_rate, num_iterations, print_cost)
     46     # A2, cache2 = …
     47     # YOUR CODE STARTS HERE
---> 48     A1, cache1 = linear_activation_forward(X, W1, b1, relu)
     49     A2, cache2 = linear_activation_forward(A1, W2, b2, sigmoid)
     50     # YOUR CODE ENDS HERE

~/work/release/W4A2/dnn_app_utils_v3.py in linear_activation_forward(A_prev, W, b, activation)
    209         A, activation_cache = relu(Z)
    210
--> 211     assert (A.shape == (W.shape[0], A_prev.shape[1]))
    212     cache = (linear_cache, activation_cache)
    213

UnboundLocalError: local variable 'A' referenced before assignment

Look at the logic in linear_activation_forward: how can you get to the assert statement with A being undefined? It means you did not take either of the logical branches, right? How could that happen? Because the value of the activation parameter you passed from the higher level does not match either of the choices. It's the same as the error you quoted earlier. Hint: in Python, relu and "relu" are not the same thing. The former is an object reference to a function and the latter is the string name of the function. What is the logic comparing against?
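To make that concrete, the dispatch inside linear_activation_forward is structured roughly like this (a sketch of the shape of the logic, not the exact course code):

def linear_activation_forward(A_prev, W, b, activation):
    if activation == "sigmoid":      # compares against the string "sigmoid"
        Z, linear_cache = linear_forward(A_prev, W, b)
        A, activation_cache = sigmoid(Z)
    elif activation == "relu":       # compares against the string "relu"
        Z, linear_cache = linear_forward(A_prev, W, b)
        A, activation_cache = relu(Z)
    # If activation is the function object relu rather than the string "relu",
    # neither branch runs, so A, linear_cache, and activation_cache are never
    # assigned, and the next line raises UnboundLocalError.
    assert (A.shape == (W.shape[0], A_prev.shape[1]))
    cache = (linear_cache, activation_cache)
    return A, cache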


Thank you very much, when I realized my error I felt so stupid :man_facepalming:

Hi, Victor.

If it’s any comfort, please note that the reason I was able to point out that error is that I’ve seen it a lot of times before. You are far from the first DLS student to step on that landmine. Maybe the bigger “meta” point here is the reasoning I described for how to track backwards to the actual error. You always have to start with the line that “throws” and then work your way backwards. It is frequently the case that a perfectly correct function can throw an error, because it was passed incorrect parameters. But the way to figure it out is to understand what is happening at the point of the error first. Then where does that lead?

The error came from the fact that when I called the linear_activation_forward function, I passed relu or sigmoid for the activation parameter instead of "relu" or "sigmoid".
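In other words, using the same calls that appear in the traceback above, the fix is just to quote the activation names:

# Before (raises UnboundLocalError): passes the function objects themselves
A1, cache1 = linear_activation_forward(X, W1, b1, relu)
A2, cache2 = linear_activation_forward(A1, W2, b2, sigmoid)

# After: passes the string names that the comparison logic expects
A1, cache1 = linear_activation_forward(X, W1, b1, "relu")
A2, cache2 = linear_activation_forward(A1, W2, b2, "sigmoid")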