Optional ReLU lab

The non-linear activation function is responsible for disabling the input prior to and sometimes after the transition points.

What does this sentence mean?

Hello @ajaykumar3456,

Here is my understanding:

“disabling” means when the ReLU outputs zero.

“transition point” means the turning point at which ReLU starts to output a non-zero value, i.e. the point where the input to ReLU equals 0.

“prior to and sometimes after the transition points”: if our input feature is x and we have w = 1, then we have ReLU(x), which disables any input “to the left” of the transition point, i.e. x < 0. If we have w = -1, then we have ReLU(-x), which disables any input “to the right” of the transition point, i.e. x > 0.
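To make this concrete, here is a minimal NumPy sketch of the two cases above (the sample values are just illustrative, not taken from the lab):

```python
import numpy as np

def relu(z):
    # ReLU passes positive values through and "disables" the rest (outputs zero)
    return np.maximum(0, z)

x = np.linspace(-2, 2, 5)  # inputs around the transition point: [-2, -1, 0, 1, 2]

# w = 1: ReLU(x) disables everything left of the transition point (x < 0)
print(relu(1 * x))    # [0. 0. 0. 1. 2.]

# w = -1: ReLU(-x) disables everything right of the transition point (x > 0)
print(relu(-1 * x))   # [2. 1. 0. 0. 0.]
```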

Does this make sense to you?

Raymond

Hello @ajaykumar3456,

There is also a lab dedicated to ReLU in Course 2 Week 2 - “ReLU activation”, which showcases how a particular neuron remains inactive (off) over part of the input range and then turns active (linear). By combining multiple such neurons, each inactive (off) or active (linear) over different parts of the range, we can build the desired output function; see the sketch below.
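Here is a rough sketch of that idea (not the lab’s actual code; the weights and transition points below are made up): each ReLU unit stays off, contributing zero, until its own transition point and then contributes linearly, so a sum of such units traces out a piecewise-linear function.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def f(x):
    # Three hypothetical units with transition points at 0, 1, and 2.
    # Each unit contributes 0 before its transition point and a linear
    # term after it, so the sum is piecewise-linear:
    #   slope 1 on [0, 1), slope 0 on [1, 2), slope 2 on [2, 3]
    return 1.0 * relu(x - 0.0) - 1.0 * relu(x - 1.0) + 2.0 * relu(x - 2.0)

print(f(np.array([0.5, 1.5, 2.5])))  # -> [0.5 1.  2. ]
```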

Makes complete sense. Thanks Raymond

Yeah. This question came up while I was working on that lab. Now I’m good. Thanks @shanup