# Numerical output values (not logical values)

Hello everybody

How can I get real numerical output values (not logical values) from my neural network model?

I mean that when I use an activation function, the output is scaled to that function's range (its Y axis). I need the predicted data to have the same magnitude as the real output.

Which algorithms or techniques can I use?

Not sure if I understand your question, since a NN's output is always numeric.

If you don't specify an activation function for the output layer, no further transformation is applied to the linear combination of the layer inputs. Does this help?
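The point above can be sketched in plain NumPy (the weights here are hypothetical random values, just for illustration): the hidden layer uses ReLU so the model stays nonlinear, while the output layer applies no activation, so predictions are raw linear combinations and are not squashed into a fixed range.

```python
# Minimal sketch (hypothetical weights): a tiny network whose hidden layer
# uses ReLU but whose output layer applies NO activation, so predictions
# can take any real value.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # hidden-layer parameters
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # output-layer parameters

def predict(x):
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU keeps the model nonlinear
    return h @ W2 + b2                # no activation: output is not rescaled

x = rng.normal(size=(3, 4))
y = predict(x)
print(y.shape)  # (3, 1); values are unbounded reals, not limited to [0, 1]
```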


My model is going to predict a numerical quantity between 0 and 3.
There is no threshold for classification, but when I use an activation function, the data gets scaled (changed, depending on the activation function, e.g. to between 0 and 1 or between -1 and +1), and if I don't use an activation function, I feel the model becomes purely linear.
Is there any way to keep the nonlinearity and get non-scaled values?
Would using the ReLU function in all layers solve this?
I am confused.


When your problem is a "regression" problem (meaning the output is a continuous number of some sort) as opposed to a "classification" problem, then you have two choices to make:

1. The output activation function
2. The loss function

If you are predicting something like a temperature, a stock price, or that type of thing, you are probably fine using ReLU as the output activation, or even just a linear output. Then you will probably want a "distance"-based cost function, e.g. MSE (Mean Squared Error).
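A short sketch of this combination (the target and prediction values below are made up for illustration): a ReLU output is nonnegative but unbounded above, so it does not squash a target in the 0-3 range, and MSE measures the squared distance between targets and predictions.

```python
# Sketch (hypothetical data): ReLU output plus a distance-based loss (MSE)
# for a real-valued target in the range 0-3.
import numpy as np

def mse(y_true, y_pred):
    # Mean Squared Error: mean squared distance between targets and predictions
    return float(np.mean((y_true - y_pred) ** 2))

y_true = np.array([0.5, 1.8, 2.9])                   # real-valued targets in 0-3
y_pred = np.maximum(0.0, np.array([0.4, 2.0, 3.1]))  # ReLU: nonnegative, not capped
print(mse(y_true, y_pred))  # approximately 0.03
```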


Thanks, Sir.
This is very useful and makes sense to me.