ReLU activation function

One additional thing, a beautiful property of neural networks, is that even if you end up with some “dead neurons”, the rest of the network can “compensate” for them.
In fact, in Course 2 you’ll learn a technique called Dropout, which reduces the variance of a NN (the gap between the error on the training set and the error on the dev/test set) by randomly “killing” some neurons (although only temporarily, during training).
So, don’t worry too much about dead neurons just yet; there are many more concepts to learn first :wink:
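
As a tiny illustration (my own sketch in NumPy, not the course’s implementation), here is what ReLU and inverted dropout look like. The `keep_prob` value of 0.8 is just an example; note how dropout zeroes some activations and rescales the rest so their expected value stays the same:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    # ReLU: negative inputs become 0 (this is how a neuron can "die"
    # if its input is always negative).
    return np.maximum(0, z)

def inverted_dropout(a, keep_prob=0.8):
    # Each activation survives with probability keep_prob;
    # dividing by keep_prob keeps the expected activation unchanged.
    mask = rng.random(a.shape) < keep_prob
    return a * mask / keep_prob

a = relu(np.array([-1.0, 0.5, 2.0, 3.0]))   # -> [0.0, 0.5, 2.0, 3.0]
a_dropped = inverted_dropout(a)             # some entries zeroed, rest scaled up
```

At test time you simply skip the dropout step and use all neurons.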