Week 2 ReLU activation optional lab

Could someone explain the Week 2 optional lab in brief? Is there any basis for adjusting the values of the parameters w and b? I found it difficult to understand.

Hi @Srilekha1

The ReLU function is defined as f(x) = max(0, x): for x > 0 the output is x, otherwise the output is 0.

So for the derivative f'(x) it's actually:

If x < 0, the output is 0; if x > 0, the output is 1.

The derivative f'(0) is not defined, so it's usually just set to 0, or you modify the activation function to f(x) = max(εx, x) for a small ε (called leaky ReLU).
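Here is a minimal NumPy sketch of those definitions, just to make them concrete (the function names, the choice of setting the derivative to 0 at x = 0, and the eps value are my own illustrative choices, not taken from the lab):

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), applied element-wise
    return np.maximum(0.0, x)

def relu_derivative(x):
    # f'(x) = 1 for x > 0, 0 for x < 0; here we use the common convention f'(0) = 0
    return np.where(x > 0, 1.0, 0.0)

def leaky_relu(x, eps=0.01):
    # leaky ReLU: keep a small slope eps for x < 0 instead of a flat 0
    return np.where(x > 0, x, eps * x)
```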
These calculate $\frac{\partial l}{\partial a}$,
BUT

to update the weights we use the chain rule to update the parameters, like in this image:

[image: chain-rule expressions for the parameter updates]

and then apply an optimization algorithm (such as gradient descent).
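To see how the ReLU derivative and the chain rule fit together in one update step, here is a rough sketch for a single neuron, a = relu(w*x + b); the squared-error loss, the variable names, and the learning rate are all illustrative assumptions on my part, not the lab's actual code:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def relu_derivative(z):
    # convention: derivative is 0 at z = 0
    return np.where(z > 0, 1.0, 0.0)

def gradient_step(w, b, x, y, lr=0.01):
    # forward pass for one neuron: a = relu(w*x + b)
    z = w * x + b
    a = relu(z)

    # chain rule, assuming a squared-error loss l = (a - y)**2
    dl_da = 2.0 * (a - y)        # derivative of the loss w.r.t. the activation
    da_dz = relu_derivative(z)   # derivative of ReLU discussed above
    dl_dz = dl_da * da_dz
    dl_dw = dl_dz * x            # z = w*x + b, so dz/dw = x
    dl_db = dl_dz                # and dz/db = 1

    # gradient-descent update of the parameters
    w_new = w - lr * dl_dw
    b_new = b - lr * dl_db
    return w_new, b_new

# example: one update step from w = 0.5, b = 0.0 on a single point (x, y) = (2.0, 3.0)
print(gradient_step(0.5, 0.0, 2.0, 3.0))
```

So the ReLU derivative is only one factor in the chain rule; the actual adjustment of w and b comes from the optimization algorithm applying the resulting gradients.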