W4_A1 UnboundLocalError: local variable 'A' referenced before assignment


UnboundLocalError                         Traceback (most recent call last)
in
----> 1 parameters, costs = two_layer_model(train_x, train_y, layers_dims = (n_x, n_h, n_y), num_iterations = 2, print_cost=False)
      2
      3 print("Cost after first iteration: " + str(costs[0]))
      4
      5 two_layer_model_test(two_layer_model)

in two_layer_model(X, Y, layers_dims, learning_rate, num_iterations, print_cost)
     48 # YOUR CODE STARTS HERE
     49
---> 50 A1, cache1 = linear_activation_forward(X, W1, b1, activation = relu)
     51 A2, cache2 = linear_activation_forward(A1, W2, b2, activation = sigmoid)
     52

~/work/release/W4A2/dnn_app_utils_v3.py in linear_activation_forward(A_prev, W, b, activation)
    209         A, activation_cache = relu(Z)
    210
--> 211     assert (A.shape == (W.shape[0], A_prev.shape[1]))
    212     cache = (linear_cache, activation_cache)
    213

UnboundLocalError: local variable 'A' referenced before assignment

I think "linear_activation_forward" has some error in this assignment.
Could you help me solve this problem?
I tried several things, but I couldn't resolve this error.

Hello @yusinjeong5859 :slight_smile:

This error is very tricky. I followed the error traceback to the definition of linear_activation_forward, which is inside dnn_app_utils_v3.py (available to you as well). If we look at its code:

[screenshot of the linear_activation_forward code in dnn_app_utils_v3.py]

then you can see that the function expects either "sigmoid" or "relu" as the value for activation. If we provide anything other than those strings, then A will never be assigned a value. Consequently, when the function runs assert (A.shape == (W.shape[0], A_prev.shape[1])), we get the local variable 'A' referenced before assignment error.
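To make this concrete, here is a minimal sketch of the pattern (a hypothetical stand-in written for illustration, not the actual dnn_app_utils_v3.py code): A is only assigned inside the two string-matched branches, so any other value for activation leaves A undefined when the assert runs.

```python
import numpy as np

def linear_activation_forward_sketch(A_prev, W, b, activation):
    # Hypothetical stand-in for the course helper: A is assigned only
    # when `activation` equals one of the two expected strings.
    Z = W @ A_prev + b
    if activation == "sigmoid":
        A = 1 / (1 + np.exp(-Z))
    elif activation == "relu":
        A = np.maximum(0, Z)
    # If `activation` is anything else (for example the function object
    # `relu` instead of the string "relu"), neither branch runs and A is
    # never created, so the next line raises UnboundLocalError.
    assert A.shape == (W.shape[0], A_prev.shape[1])
    return A

A_prev = np.ones((3, 2)); W = np.ones((1, 3)); b = np.zeros((1, 1))

linear_activation_forward_sketch(A_prev, W, b, activation="relu")  # works

def relu(z):
    # stand-in for the notebook's relu helper, defined here only so we
    # can demonstrate passing the function object by mistake
    return np.maximum(0, z)

try:
    linear_activation_forward_sketch(A_prev, W, b, activation=relu)
except UnboundLocalError as e:
    print("UnboundLocalError:", e)
```

Passing activation=relu (the function object) reproduces exactly the error in the traceback above, while activation="relu" (the string) runs fine.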

My suggestion is for you to go through the error traceback, check what you had supplied to linear_activation_forward for the activation parameter, and see if you can fix it.

Cheers,
Raymond

Hi,
I ran into the same thing with the activation parameter of the linear_activation_forward and linear_activation_backward utility functions. Please pass activation = "relu" or activation = "sigmoid" as a string; that will resolve the issue.
e.g. A1, cache1 = linear_activation_forward(X, W1, b1, activation = "relu")
or dA1, dW2, db2 = linear_activation_backward(dA2, cache2, activation = "sigmoid")
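For anyone wondering why the quotes matter: a bare relu names the function object (assuming relu is defined in the notebook, as it is in this assignment), while "relu" is the string the helper compares against, so the two are never equal:

```python
import numpy as np

def relu(z):
    # stand-in for the notebook's relu helper (assumed, for illustration)
    return np.maximum(0, z)

print(type(relu))       # a function object
print(type("relu"))     # a str
print(relu == "relu")   # False, so the helper's `activation == "relu"` check never matches
```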

It would be better to share your full error, rather than your code.

Hi,
The full error is mentioned in the message by yusinjeong5859 at the very beginning of the thread.

Hello, PreetamKumar.

Paul sir has already explained that in his reply.

the function expects either "sigmoid" or "relu" as the value for activation. If we provide anything other than those, then A will never be assigned a value. Consequently, when it runs assert (A.shape == (W.shape[0], A_prev.shape[1])), we get the local variable 'A' referenced before assignment error.

Please look carefully at the activation argument you are providing in each case.