Paper: "Activation Functions in Artificial Neural Networks: A Systematic Overview"

If anyone is ever looking for a systematic review of popular activation functions, Johannes Lederer has published a survey.

Activation functions shape the outputs of artificial neurons and, therefore, are integral parts of neural networks in general and deep learning in particular. Some activation functions, such as logistic and relu, have been used for many decades. But with deep learning becoming a mainstream research topic, new activation functions have mushroomed, leading to confusion in both theory and practice. This paper provides an analytic yet up-to-date overview of popular activation functions and their properties, which makes it a timely resource for anyone who studies or applies neural networks.

Published 2021-01-25

The survey covers the following topics. The tensorflow/keras documentation on activation functions has its own descriptions, which I have added alongside; a short code sketch of the listed functions follows the list:

  • Sigmoid Functions
  • Piecewise-Linear Functions
  • Other Functions
    • softplus
    • elu and selu
    • swish
    • Learning Activation Functions
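Not from the paper itself, but for concreteness, here is a minimal NumPy sketch of the listed functions using their standard textbook definitions. The selu constants are the ones from Klambauer et al. (2017), and swish is shown in its β-parameterized form (β = 1 gives silu); everything else here is just illustration:

```python
import numpy as np

def sigmoid(x):
    # logistic function: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # piecewise-linear: max(0, x)
    return np.maximum(0.0, x)

def softplus(x):
    # smooth approximation of relu: log(1 + e^x),
    # computed stably as log(e^0 + e^x)
    return np.logaddexp(0.0, x)

def elu(x, alpha=1.0):
    # exponential linear unit: x for x > 0, alpha * (e^x - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x):
    # scaled elu; constants from Klambauer et al. (2017)
    alpha, scale = 1.6732632423543772, 1.0507009873554805
    return scale * elu(x, alpha)

def swish(x, beta=1.0):
    # swish: x * sigmoid(beta * x); beta = 1 is also known as silu
    return x * sigmoid(beta * x)

x = np.linspace(-3, 3, 7)
for f in (sigmoid, relu, softplus, elu, selu, swish):
    print(f"{f.__name__:>8}:", np.round(f(x), 3))
```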

As a supplement, here is a paper about searching for “good” activation functions using a genetic-algorithm search metaheuristic.
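To make that idea concrete (this is a toy illustration, not that paper's actual method): such a search typically represents each candidate activation as a small composition of primitive operations, scores it by training a network and measuring validation accuracy, and evolves the population through selection and mutation. A minimal sketch, with an assumed primitive set and a placeholder fitness function standing in for real network training:

```python
import random
import numpy as np

# Primitive sets are an assumption for this sketch; real search spaces
# are richer, and fitness comes from a trained network's validation accuracy.
UNARY = {
    "identity": lambda x: x,
    "neg": lambda x: -x,
    "abs": np.abs,
    "tanh": np.tanh,
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "relu": lambda x: np.maximum(0.0, x),
}
BINARY = {
    "add": np.add,
    "mul": np.multiply,
    "max": np.maximum,
}

def make_genome():
    # genome encodes f(x) = binary(unary1(x), unary2(x))
    return (random.choice(list(UNARY)), random.choice(list(BINARY)),
            random.choice(list(UNARY)))

def evaluate(genome):
    # Placeholder fitness: in practice, plug the candidate activation into
    # a small network and return validation accuracy. Here we just reward
    # closeness to swish on a sample grid, purely for demonstration.
    u1, b, u2 = genome
    x = np.linspace(-3, 3, 101)
    y = BINARY[b](UNARY[u1](x), UNARY[u2](x))
    target = x / (1.0 + np.exp(-x))  # swish with beta = 1
    return -np.mean((y - target) ** 2)

def mutate(genome):
    # replace one randomly chosen gene with a fresh primitive
    g = list(genome)
    i = random.randrange(3)
    g[i] = random.choice(list(BINARY if i == 1 else UNARY))
    return tuple(g)

population = [make_genome() for _ in range(20)]
for generation in range(30):
    population.sort(key=evaluate, reverse=True)
    survivors = population[:10]                       # selection
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(10)]     # mutation
print("best genome:", max(population, key=evaluate))
```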


@dtonhofer Good articles. :ok_hand:
