Week 4 Part 1 Exercise 8

Relevant piece of code:
dZ = sigmoid_backward(dA, activation_cache)
dA_prev, dW, db = linear_backward(dZ, cache)

dZ = relu_backward(dA, activation_cache)
dA_prev, dW, db = linear_backward(dZ, cache)

When running the code, I receive the following error:

ValueError: not enough values to unpack (expected 3, got 2)

I don’t know what I’m doing wrong. Thanks in advance for the help.

Hi, that error seems to indicate that the function linear_backward is returning just two values when you are expecting three. That's a bit odd, since in Exercise 7 you also use the function:

t_dA_prev, t_dW, t_db = linear_backward(t_dZ, t_linear_cache)

Is that exercise passing all tests? If so, could you please post the full traceback of the error (not your code, just the error output)?

ValueError                                Traceback (most recent call last)
      1 t_dAL, t_linear_activation_cache = linear_activation_backward_test_case()
----> 3 t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation = "sigmoid")
      4 print("With sigmoid: dA_prev = " + str(t_dA_prev))
      5 print("With sigmoid: dW = " + str(t_dW))

in linear_activation_backward(dA, cache, activation)
     36     dZ = relu_backward(dA, activation_cache)
---> 37     dA_prev, dW, db = linear_backward(dZ, cache)

in linear_backward(dZ, cache)
     14     db -- Gradient of the cost with respect to b (current layer l), same shape as b
     15     """
---> 16     A_prev, W, b = cache
     17     m = A_prev.shape[1]

ValueError: not enough values to unpack (expected 3, got 2)

Sure. I guess I'm using "cache" interchangeably in relu_backward and in linear_backward when they are not the same thing. One of the caches has 3 components and the other has 2. However, I don't know what to feed the linear_backward function (I can't find the corresponding cache in the exercise).


I've already figured it out. Thanks anyway.

I have the same error and my exercises 7 and 8 passed all tests.

Hi, I got the exact same error message for Exercise 8, and I passed all tests in Exercise 7. May I know how I can fix it? Thanks.

I had the same issue too.
You have to pass "linear_cache" as the parameter, not the whole cache.
Good luck :relaxed:
