Is it possible to see the code for the sigmoid_backward and relu_backward functions?
Yes. Open File -> Open and you will see all the saved files in the lab.
Thank you very much…
I wrote my relu_backward code, and I just want to check whether it is OK:
import numpy as np

def relu_backward_m(dA, activation_cache):
    Z = activation_cache
    dg = np.where(Z < 0, 0, 1)    # derivative of ReLU: 0 where Z < 0, else 1
    dZ = np.multiply(dA, dg)      # chain rule: dZ = dA * g'(Z)
    return dZ
Did you compare it with the one in dnn_utils? It seems a bit different to me. You can test both with a range of values and see if they produce the same output.
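For instance, something along these lines (a minimal sketch with arbitrary shapes; it just compares the np.where approach above with the mask-based approach used in dnn_utils):

import numpy as np

# Spot check: compare the np.where-based gradient with the
# "copy dA, then zero it where Z <= 0" approach used in dnn_utils.
np.random.seed(1)
dA = np.random.randn(3, 4)
Z = np.random.randn(3, 4)

dZ_where = np.multiply(dA, np.where(Z < 0, 0, 1))   # gradient 1 where Z >= 0

dZ_mask = np.array(dA, copy=True)
dZ_mask[Z <= 0] = 0                                  # gradient 0 where Z <= 0

# The two can only differ where Z is exactly 0, which practically never
# happens with random floats.
print(np.allclose(dZ_where, dZ_mask))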
The code below is from the dnn_app_utils_v3 file of DLS Course 1, Week 4, Assignment 2.
def relu_backward(dA, cache):
    """
    Implement the backward propagation for a single RELU unit.

    Arguments:
    dA -- post-activation gradient, of any shape
    cache -- 'Z' where we store for computing backward propagation efficiently

    Returns:
    dZ -- Gradient of the cost with respect to Z
    """
    Z = cache
    dZ = np.array(dA, copy=True)  # just converting dz to a correct object.
    # When z <= 0, you should set dz to 0 as well.
    dZ[Z <= 0] = 0
    assert (dZ.shape == Z.shape)
    return dZ
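sigmoid_backward in the same file follows the same pattern. As a rough sketch, using the standard sigmoid derivative s * (1 - s) (this is the usual formulation; check the file itself for the exact code):

import numpy as np

def sigmoid_backward(dA, cache):
    """
    Backward propagation for a single sigmoid unit (sketch).
    """
    Z = cache
    s = 1 / (1 + np.exp(-Z))   # recompute the sigmoid activation from Z
    dZ = dA * s * (1 - s)      # chain rule: dA times sigmoid'(Z)
    assert (dZ.shape == Z.shape)
    return dZ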
Best,
Saif.
Thank you a lot, I will do it.