Why do we need an activation function?

Hi @Med-akraou

The ReLU activation function isn't a linear function; it's a simple non-linear function. Even if we assume that all the features and the initial values of the weights are positive:

  • The optimization algorithm will adjust and tune these weights so that the model's output gets closer to the target values.
  • We didn't build a simple linear regression; we built a big, complex model with more than one hidden layer to fit complex data such as images and sounds. As training updates the weights across these layers, some of the pre-activation values will become negative, and for those the ReLU output will be 0 (see the short sketch below).

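To make this concrete, here is a minimal NumPy sketch I put together (it is only an illustration, not code from the course); the array `W_after_training` is a hypothetical set of weights standing in for what gradient descent might produce after a few updates:

```python
import numpy as np

def relu(z):
    # ReLU is piecewise linear: identity for z > 0, exactly 0 for z <= 0
    return np.maximum(0, z)

# ReLU is NOT linear: relu(a + b) != relu(a) + relu(b) in general
a, b = np.array([2.0]), np.array([-3.0])
print(relu(a + b))        # [0.]  because 2 - 3 = -1 < 0
print(relu(a) + relu(b))  # [2.]  -> the results differ, so ReLU is non-linear

# Even if the features and the *initial* weights are all positive,
# training is free to push some weights negative. Once that happens,
# some pre-activations z = W @ x become negative and ReLU turns those
# units off (outputs 0), which is exactly where the non-linearity acts.
x = np.array([0.6, 0.9, 0.5])                     # positive features
W_after_training = np.array([[0.8, -1.2, 0.3],    # hypothetical weights after
                             [0.1,  0.4, 0.2]])   # some gradient-descent steps
z = W_after_training @ x
print(z)        # [-0.45  0.52]  -> first pre-activation is negative
print(relu(z))  # [ 0.    0.52]  -> the negative entry is clipped to 0
```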
Why we use this kind of simple non-linear activation function was discussed in this thread by Mentor @paulinpaloalto.

Cheers,
Abdelrahman