Week 4 Step by step L_model_backward

One of my grads calculations is choking - please advise where to look. Thanks!

t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
grads = L_model_backward(t_AL, t_Y_assess, t_caches)

print("dA0 = " + str(grads['dA0']))
print("dA1 = " + str(grads['dA1']))
print("dW1 = " + str(grads['dW1']))
print("dW2 = " + str(grads['dW2']))
print("db1 = " + str(grads['db1']))
print("db2 = " + str(grads['db2']))

L_model_backward_test(L_model_backward)

len(cache): 2

Cache is: ((array([[ 1.97611078, -1.24412333],
[-0.62641691, -0.80376609],
[-2.41908317, -0.92379202]]), array([[-1.02387576, 1.12397796, -0.13191423]]), array([[-1.62328545]])), array([[ 0.64667545, -0.35627076]]))


TypeError Traceback (most recent call last)
TypeError: float() argument must be a string or a number, not 'tuple'

The above exception was the direct cause of the following exception:

ValueError Traceback (most recent call last)
in
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
50 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, "sigmoid")
51
---> 52 grads["dA" + str(L-1)] = np.dot(dW_temp.T, current_cache)*(1-np.power(AL ,2))
53 grads["dW" + str(L)] = (1/m) * np.dot(current_cache,AL.T)
54 grads["db" + str(L)] = (1/m)*np.sum(current_cache,axis = 1, keepdims = True)

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: setting an array element with a sequence.
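For context, this error can be reproduced in isolation: `current_cache` is a nested tuple of differently shaped arrays, and passing the whole tuple to `np.dot` makes NumPy try to coerce it into a single array, which fails. A minimal sketch, with random arrays whose shapes mirror the cache printed above:

```python
import numpy as np

# Stand-in cache with the same nesting and shapes as the one printed above:
# ((A_prev, W, b), Z) where A_prev is (3, 2), W is (1, 3), b is (1, 1), Z is (1, 2).
linear_cache = (np.random.randn(3, 2), np.random.randn(1, 3), np.random.randn(1, 1))
activation_cache = np.random.randn(1, 2)
current_cache = (linear_cache, activation_cache)

dW_temp = np.random.randn(1, 3)

# Passing the whole cache tuple to np.dot forces NumPy to coerce a ragged
# tuple of arrays into one array, which raises the error from the traceback.
try:
    np.dot(dW_temp.T, current_cache)
except ValueError as e:
    print("ValueError:", e)
```

The fix is never to feed `current_cache` itself into a gradient formula; the cache is only an input to the helper functions.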

Hey @Brendon_Wolff-Piggot,
Please take a close look at your expressions for grads["dA" + str(L-1)], grads["dW" + str(L)] and grads["db" + str(L)]. Look at what the linear_activation_backward function returns, and at how those return values are meant to be used. As of now, you are using them incorrectly, and that's why you are getting this error. Let me know if this helps; otherwise I will help you out with the code.
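To make the intended pattern concrete, here is a minimal, self-contained sketch. The linear_activation_backward below is a stand-in that mirrors the course helper's interface (the real one comes from the assignment); the shapes mirror the cache printed above. The key point is that the helper already returns the three gradients, so they are stored directly, with no extra np.dot:

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def linear_activation_backward(dA, cache, activation):
    """Minimal stand-in mirroring the course helper's interface.
    cache is ((A_prev, W, b), Z); returns (dA_prev, dW, db)."""
    (A_prev, W, b), Z = cache
    m = A_prev.shape[1]
    if activation == "sigmoid":
        s = sigmoid(Z)
        dZ = dA * s * (1 - s)
    else:  # "relu"
        dZ = dA * (Z > 0)
    dW = (1 / m) * np.dot(dZ, A_prev.T)
    db = (1 / m) * np.sum(dZ, axis=1, keepdims=True)
    dA_prev = np.dot(W.T, dZ)
    return dA_prev, dW, db

# Hypothetical 2-layer setup with shapes matching the printed cache.
L = 2
A_prev = np.random.randn(3, 2)
W = np.random.randn(1, 3)
b = np.random.randn(1, 1)
Z = np.dot(W, A_prev) + b
current_cache = ((A_prev, W, b), Z)
AL = sigmoid(Z)
Y = np.array([[1, 0]])
dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

grads = {}
dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, "sigmoid")
# Store the returned arrays directly -- do not recompute them from the cache.
grads["dA" + str(L - 1)] = dA_prev_temp
grads["dW" + str(L)] = dW_temp
grads["db" + str(L)] = db_temp

print(grads["dA1"].shape, grads["dW2"].shape, grads["db2"].shape)
```

Compare this with the failing line 52: there, current_cache (a tuple) is fed into np.dot, whereas the gradient dA_prev_temp was already computed and only needs to be assigned.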

Cheers,
Elemento
