Course 2 -- Week 1 -- Numerical approximation of gradients

Hello, I have a question about numerical approximation. What happens if we use the ReLU function, which is non-differentiable at 0? Can we still use the two-sided difference to estimate the derivative, given that the cost function is non-differentiable at some points? Thank you.


Right, for our purposes the non-differentiability of ReLU at 0 doesn't cause any problem in practice. You can use either 0 or 1 as the value of the derivative at 0 and everything just works. And note that ReLU itself is perfectly well defined at 0 (it evaluates to 0), so the two-sided finite-difference approximation of the derivative can be computed there with no issues.
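
To make this concrete, here is a minimal sketch (the function names `relu` and `centered_diff` are just illustrative, not from the course code) showing the two-sided difference applied to ReLU, including at the non-differentiable point x = 0:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def centered_diff(f, x, eps=1e-7):
    # Two-sided (centered) finite difference: (f(x+eps) - f(x-eps)) / (2*eps)
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

print(centered_diff(relu, 2.0))   # ~1.0, matches the true derivative for x > 0
print(centered_diff(relu, -2.0))  # ~0.0, matches the true derivative for x < 0
print(centered_diff(relu, 0.0))   # 0.5, the average of the two one-sided slopes
```

At exactly x = 0 the centered difference gives 0.5, the average of the left slope (0) and the right slope (1). Since in floating-point arithmetic you essentially never land exactly on 0, this corner case is harmless for gradient checking.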