Is this activation function always used for every regression problem?

Greetings fellow learners. Could someone please tell me whether the linear activation function is used for every regression problem?


The ReLU activation function, as discussed in the video, can also be used for regression.

TensorFlow also has a few different types of activation functions for both classification and regression if you wish to look them up.
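To make the options concrete, here is a minimal pure-Python sketch of a few common activations (TensorFlow exposes these as `tf.keras.activations.linear`, `tf.keras.activations.relu`, and `tf.keras.activations.sigmoid`, among others):

```python
import math

def linear(x):
    return x  # identity: typical for a regression output layer

def relu(x):
    return max(0.0, x)  # also usable in hidden layers of a regression network

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))  # common for binary classification outputs

print(relu(-2.0), relu(3.0))  # 0.0 3.0
print(linear(3.0))            # 3.0
```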

Hey @Talha1234,

If you are asking whether we can use the linear activation function in every layer of a neural network built for a regression problem, then the answer is YES, YOU CAN! But does that mean you SHOULD? Not really: the performance would mostly be poor, for the reasons discussed in the lecture videos and in the upcoming ones.
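One reason an all-linear network performs poorly can be shown directly: composing two linear layers is equivalent to a single linear layer, so stacking them adds no expressive power. A small NumPy sketch with hypothetical weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with linear (identity) activation; weights are illustrative.
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

x = rng.normal(size=3)

# Forward pass through both linear layers.
two_layer = W2 @ (W1 @ x + b1) + b2

# The same mapping collapses into one linear layer W x + b.
W, b = W2 @ W1, W2 @ b1 + b2
one_layer = W @ x + b

print(np.allclose(two_layer, one_layer))  # True: no extra expressive power
```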

As Sam pointed out, there are a ton of activation functions you can use, and TensorFlow implements many of them. Some of these have significant advantages over the linear activation, which is why they are used far more frequently. I hope this helps.