Backward propagation equations for different activation functions

Hi there

I’m new to the forum so please forgive me if I missed a thread with the answer to this question. Does anyone know where to find the derivation of backward propagation equations when using activation functions other than the sigmoid function (e.g. ReLU, tanh, …)?

Thanks in advance
Antonio

Check this thread for the tanh derivation.

For ReLU, I can't find a mathematical explanation on this forum, but it's simple: the derivative is 1 if $x > 0$ and 0 otherwise. (Strictly speaking it's undefined at $x = 0$, but in practice 0 is used there.)
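
More generally, swapping the activation changes only one term in the backward pass. A sketch of the idea, using the usual notation $Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]}$ and $A^{[l]} = g(Z^{[l]})$:

$$
dZ^{[l]} = dA^{[l]} \odot g'\big(Z^{[l]}\big),
$$

where for the common elementwise activations

$$
g'(z) =
\begin{cases}
\sigma(z)\,\big(1 - \sigma(z)\big) & \text{sigmoid} \\
1 - \tanh^2(z) & \text{tanh} \\
\mathbf{1}[z > 0] & \text{ReLU}
\end{cases}
$$

Everything else in the backward propagation equations ($dW^{[l]}$, $db^{[l]}$, $dA^{[l-1]}$) stays the same.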
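And here's a minimal NumPy sketch of the corresponding backward steps (the function and variable names are my own, not from any particular assignment):

```python
import numpy as np

def relu_backward(dA, Z):
    """Backward pass through a ReLU activation.

    dA: gradient of the loss w.r.t. the ReLU output A = relu(Z)
    Z:  pre-activation values cached from the forward pass
    Returns dZ = dA * g'(Z), where g'(z) = 1 if z > 0 else 0.
    """
    return dA * (Z > 0)  # the boolean mask is exactly the ReLU derivative

def tanh_backward(dA, Z):
    """Backward pass through a tanh activation: g'(z) = 1 - tanh(z)**2."""
    return dA * (1.0 - np.tanh(Z) ** 2)

# quick check against a finite-difference estimate at one point
Z = np.array([[-1.0, 0.5]])
dA = np.ones_like(Z)
print(relu_backward(dA, Z))   # [[0. 1.]]
print(tanh_backward(dA, Z))   # [[0.4200 0.7864]] (approximately)
```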