Is "rectified" in ReLU derived from "rectifier"?

In physics, a rectifier is a circuit that converts AC to DC by simply cutting off the troughs of the AC signal. Now, in ReLU the negative values of the linear function are set to zero, which makes me wonder: is the name taken from the rectifier circuit?

Not an ML-related question, I know, but I am curious.
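As a quick illustration of the analogy (a minimal sketch of my own, using an arbitrary sine wave as the "AC" signal), applying max(0, x) to a signal does look exactly like half-wave rectification:

```python
import numpy as np

# An "AC" signal: one full cycle of a sine wave
t = np.linspace(0, 2 * np.pi, 100)
signal = np.sin(t)

# ReLU is just max(0, x): the negative half-cycle (the troughs)
# is clipped to zero, much like a half-wave rectifier does
rectified = np.maximum(0.0, signal)

print(signal.min(), signal.max())        # roughly -1.0 and 1.0
print(rectified.min(), rectified.max())  # 0.0 and roughly 1.0
```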

What does this line mean in relation to ReLU?

This enables multiple units to contribute to the resulting function without interfering.

@tbhaxor

When a neural network has multiple neurons with the ReLU activation function, it becomes possible to piece together the contribution from each neuron by controlling when each neuron turns ON or OFF. In the ON state a particular neuron contributes towards the output; in the OFF state it does not contribute to, or interfere with, the outputs coming from the other neurons that are in the ON state.
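To make the ON/OFF picture concrete, here is a minimal sketch (the input and weights are made-up numbers, purely for illustration) of one input passing through three ReLU neurons, where only the neuron with w.x + b > 0 ends up contributing:

```python
import numpy as np

# Made-up numbers: a 2-dimensional input and 3 hidden ReLU neurons
x = np.array([1.0, -2.0])

W = np.array([[ 0.5,  1.0],   # neuron 0
              [ 1.0,  0.2],   # neuron 1
              [-0.3,  0.4]])  # neuron 2
b = np.array([0.0, 0.1, -0.2])

pre = W @ x + b               # each neuron's w.x + b
out = np.maximum(0.0, pre)    # ReLU

print(pre)  # [-1.5  0.7 -1.3] -> neurons 0 and 2 are OFF, neuron 1 is ON
print(out)  # [ 0.   0.7  0. ] -> only neuron 1 contributes downstream
```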

Yes, this makes sense. So mathematically, "interfering" means that the product of the unit's output with its weight is != 0.

By "piece together", what operation do you mean: adding them all, multiplying them all, or something else?

The ReLU function has a non-zero output only when its input is > 0. So if w.x + b of a particular neuron is < 0, the output of that neuron is 0 and hence it does not make an active contribution. However, there could be other neurons whose respective w.x + b > 0, and those do make an active contribution.
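As a sketch of the "piece together" part (again with made-up numbers): summing several ReLU units whose biases make them turn ON at different points of the input range builds a piecewise-linear function, with each unit adding to the slope only where it is ON:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Made-up 1-input layer with 3 ReLU units; the biases set the points
# where each unit turns ON (unit k is ON where w[k]*x + b[k] > 0)
w = np.array([1.0, 1.0, 1.0])
b = np.array([0.0, -1.0, -2.0])

def f(x):
    # Units that are OFF output 0, so they do not interfere;
    # the ON units are simply summed into the final output
    return np.sum(relu(w * x + b))

for x in [0.5, 1.5, 2.5]:
    print(x, f(x))  # the slope increases each time another unit turns ON
```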


We have a ReLU lab specifically devoted to understanding how and when the contributions from the various neurons piece together the final output.
