Hello,

Can someone please explain what np.int64(A2 > 0) does in the following line of code?

dZ2 = np.multiply(dA2, np.int64(A2 > 0))

This is inside the function backward_propagation_with_regularization.

Thank you,

Nay


That is just the derivative of the ReLU activation function. Remember that the relevant back propagation formula is:

dZ^{[l]} = dA^{[l]} * g^{[l]'}(Z^{[l]})

The derivative of ReLU is 0 if Z \leq 0 and 1 if Z > 0, right? The expression np.int64(A2 > 0) casts that boolean comparison to a 0/1 array, which is exactly that derivative. And it works the same with A in place of Z, because A = ReLU(Z) = max(0, Z), so A > 0 if and only if Z > 0.
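Here is a small sketch showing what that line computes, using made-up example values for A2 and dA2 (any shapes work, as long as they match):

```python
import numpy as np

# Hypothetical small example values, just for illustration.
A2 = np.array([[-1.0, 0.0, 2.5],
               [ 3.0, -0.5, 0.1]])
dA2 = np.array([[0.2, 0.4, -0.6],
                [0.8, 1.0, -1.2]])

# A2 > 0 gives a boolean array; np.int64 casts it to 0/1 integers.
# This is the ReLU derivative g'(Z2), since A2 > 0 iff Z2 > 0.
mask = np.int64(A2 > 0)
print(mask)   # [[0 0 1]
              #  [1 0 1]]

# Element-wise product zeroes out dA2 wherever the unit was inactive.
dZ2 = np.multiply(dA2, mask)
print(dZ2)    # [[ 0.   0.  -0.6]
              #  [ 0.8  0.  -1.2]]
```

So the gradient passes through unchanged where A2 > 0 and is killed everywhere else, which is exactly dA2 * g'(Z2) for ReLU.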
