W4_A1_Ex-8_Error in relu_backward function

TypeError Traceback (most recent call last)
in
1 t_dAL, t_linear_activation_cache = linear_activation_backward_test_case()
2
----> 3 t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation = "sigmoid")
4 print("With sigmoid: dA_prev = " + str(t_dA_prev))
5 print("With sigmoid: dW = " + str(t_dW))

in linear_activation_backward(dA, cache, activation)
31 # dA_prev, dW, db = ...
32 # YOUR CODE STARTS HERE
---> 33 dZ = relu_backward(dA, linear_cache)
34 dA_prev, dW, db = linear_backward(dZ, linear_cache)
35 # YOUR CODE ENDS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
54
55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
57
58 assert (dZ.shape == Z.shape)

TypeError: '<=' not supported between instances of 'tuple' and 'int'

Can anyone point out what I am doing wrong?

Hello,

OK. Do you know what activation_cache and linear_cache are? What is stored in each? And which cache do we need to pass to relu_backward?

A hint is given to you there. Kindly read all the text under the heading 6.2 - Linear-Activation Backward.
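
In case it helps once you have re-read that section, here is a minimal, self-contained sketch of the failure mode. It assumes the notebook's convention that the combined cache is a tuple (linear_cache, activation_cache); the toy shapes below are for illustration only, not from the assignment:

```python
import numpy as np

# Assumed convention from the notebook:
#   cache            = (linear_cache, activation_cache)
#   linear_cache     = (A_prev, W, b)  -> consumed by linear_backward
#   activation_cache = Z               -> consumed by relu_backward

A_prev = np.random.randn(3, 2)   # toy shapes, illustrative only
W = np.random.randn(1, 3)
b = np.zeros((1, 1))
Z = W @ A_prev + b

linear_cache = (A_prev, W, b)
activation_cache = Z
cache = (linear_cache, activation_cache)

dA = np.random.randn(*Z.shape)

# relu_backward does `dZ[Z <= 0] = 0`. If it receives linear_cache,
# then Z is a tuple, and `Z <= 0` raises exactly the TypeError in
# the traceback above:
try:
    linear_cache <= 0
except TypeError as e:
    print(e)  # '<=' not supported between instances of 'tuple' and 'int'

# Unpacking the combined cache and passing each part to the function
# that expects it avoids the error:
linear_cache, activation_cache = cache
dZ = np.array(dA, copy=True)
dZ[activation_cache <= 0] = 0   # what relu_backward does with the real Z
print(dZ.shape == Z.shape)      # True
```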

Best,
Saif.


Well, thank you to both mentors for taking the time to answer and help me out. The issue has been resolved.

I am glad your issue has been resolved.
