Problem in calling sigmoid_backward in linear_activation_backward

Hello. In the Building_your_Deep_Neural_Network_Step_by_Step assignment, I am stuck at the "linear_activation_backward" stage.
The sigmoid_backward function expects a cache as its second input, and within the function it assigns this cache to Z.
However, the cache is a tuple containing A, W, and b.
How does this work?
Thx, Gilad

Could you please share the full error?


OK.
The function is:

# GRADED FUNCTION: linear_activation_backward

# mentor edit: code removed - not allowed by Code of Conduct

The call to the function:

```python
t_dAL, t_linear_activation_cache = linear_activation_backward_test_case()

t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation="sigmoid")
print("With sigmoid: dA_prev = " + str(t_dA_prev))
print("With sigmoid: dW = " + str(t_dW))
print("With sigmoid: db = " + str(t_db))

t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation="relu")
print("With relu: dA_prev = " + str(t_dA_prev))
print("With relu: dW = " + str(t_dW))
print("With relu: db = " + str(t_db))

linear_activation_backward_test(linear_activation_backward)
```

The error:

```
TypeError                                 Traceback (most recent call last)
Cell In[48], line 4
      2 cache = t_linear_activation_cache
      3 cache[0][2]
----> 4 dZ = relu_backward(dA, cache)

File G:\My Drive\personal\my courses\coursera\deep learning\W4\W4A1\dnn_utils.py:56, in relu_backward(dA, cache)
     53 dZ = np.array(dA, copy=True)  # just converting dz to a correct object.
     55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
     58 assert (dZ.shape == Z.shape)
     60 return dZ

TypeError: '<=' not supported between instances of 'tuple' and 'int'
```

Hi @gilad.danini ,

Please check the argument you pass to the backward helper functions. In your code, you passed the full `cache` to both the activation part and the linear part.
The variable `cache` contains both the linear cache and the activation cache, so only the appropriate piece should be passed to each one.
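To illustrate the structure (a simplified sketch only, not the graded solution, which we can't post here): the cache built during the forward pass is a 2-tuple, and `linear_activation_backward` has to unpack it before calling the activation helpers. The `relu_backward` below is a minimal stand-in with the same behavior as the one in `dnn_utils.py`; the shapes are made up for the demo.

```python
import numpy as np

# Minimal stand-in for the course's relu_backward: the activation cache is just Z.
def relu_backward(dA, activation_cache):
    Z = activation_cache
    dZ = np.array(dA, copy=True)  # copy dA so we can zero entries in place
    dZ[Z <= 0] = 0                # gradient of ReLU is 0 where Z <= 0
    return dZ

# What linear_activation_forward stores (toy shapes for illustration):
A_prev = np.random.randn(3, 2)
W = np.random.randn(1, 3)
b = np.zeros((1, 1))
Z = W @ A_prev + b

linear_cache = (A_prev, W, b)          # needed by the *linear* backward step
activation_cache = Z                   # needed by relu_backward / sigmoid_backward
cache = (linear_cache, activation_cache)

# Inside linear_activation_backward, unpack first:
linear_cache, activation_cache = cache   # i.e. cache[0] and cache[1]
dA = np.random.randn(*Z.shape)
dZ = relu_backward(dA, activation_cache) # correct: pass cache[1], NOT the whole cache
```

Passing the whole `cache` instead of `activation_cache` is exactly what triggers the `'<=' not supported between instances of 'tuple' and 'int'` error, because `Z <= 0` is then comparing a tuple to an integer.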


And the one to pass to sigmoid_backward should be cache[1]?

Got it right. THX!!!
