In softmax regression, will the new activation function replace all of the ReLU and sigmoid activations?
Or does it just replace the sigmoid in the output layer?
Is ReLU kept in the hidden layers?
Thanks!
The question is a bit vague, but if you mean replacing a sigmoid classification output with softmax, then yes: the softmax is applied to the output.
The output layer is softmax.
But does every neuron in the hidden layers still use ReLU?
Yes, the hidden layers remain unchanged.
Hi, @Fatcar2002!
Activation functions can be chained, and they are computed sequentially in the order you have programmed them. Adding the softmax activation at the end applies it to whatever output you have so far, whether that output comes from a layer or from another activation function.
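As a quick sketch of what this looks like in practice (plain NumPy; the layer sizes and random weights here are made up for illustration, not from the course): the hidden layer keeps ReLU, and softmax replaces only the sigmoid at the output.

```python
import numpy as np

def relu(z):
    # ReLU: hidden-layer activation, unchanged by the switch to softmax
    return np.maximum(0.0, z)

def softmax(z):
    # Subtract the row max for numerical stability, then normalize
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))                       # one example, 4 features
W1, b1 = rng.normal(size=(4, 5)), np.zeros(5)     # hidden layer (ReLU)
W2, b2 = rng.normal(size=(5, 3)), np.zeros(3)     # output layer (softmax)

h = relu(x @ W1 + b1)       # hidden layer still uses ReLU
p = softmax(h @ W2 + b2)    # softmax applied only to the final output

print(p)                    # a row of class probabilities summing to 1
```

The chaining the reply describes is just function composition: the softmax receives whatever the previous layer (or activation) produced.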
I got it, thank you!