Doubt??

Can we apply feature scaling to the target variable (y) in the training dataset?

If so, what will its effect be?

Generally that’s not done. It usually doesn’t provide any benefit.

In some cases you might need to scale the target variable (y) if it has a very large spread of values, since large targets can produce large error gradients that change the weights dramatically and make the learning process unstable. In the following reference, the model weights exploded during training because of the very large errors, and in turn the error gradients calculated for the weight updates exploded as well. In short, if you don't scale the data and you have very large values, make sure to use a very small learning rate.
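As a minimal sketch of that idea, assuming plain NumPy and a standardization (z-score) transform as the scaling method (the values here are hypothetical):

```python
import numpy as np

# Hypothetical target with a very large spread of values.
y = np.array([1.0, 50.0, 1_000.0, 25_000.0, 120_000.0])

# Standardize y: zero mean, unit variance. Keep the statistics so
# predictions can be mapped back to the original units later.
y_mean, y_std = y.mean(), y.std()
y_scaled = (y - y_mean) / y_std

# Errors (and hence gradients) computed against y_scaled stay in a
# modest range, so a normal learning rate no longer blows up the weights.
print(y_scaled.mean(), y_scaled.std())  # ~0.0 and ~1.0

# After training, invert the transform to recover real-valued predictions.
y_back = y_scaled * y_std + y_mean
assert np.allclose(y_back, y)
```

scikit-learn offers the same pattern out of the box via `StandardScaler` or `TransformedTargetRegressor`, which also handle the inverse transform for you.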

Thanks for your reply.

*“If your problem is a regression problem, then the output will be a real value.”*

I found this statement at the link above.

My question is about a regression problem (multiple linear regression). So feature scaling of the target variable is not necessary in a regression problem (multiple linear regression)?

Am I right? @Osama_Saad_Farouk

It's necessary for all regression problems, for the same reasons.

Hi @Osama_Saad_Farouk, thanks for your replies. One last question: if the target variable (y) ranges from, say, 1 to 100, is it necessary to feature scale it?

Just a confirmation !!

It depends. Suppose you have a dataset with 1,000 examples whose targets range from 1 to 20, plus 20 examples that range from 95 to 100. Those 20 values, even though they are few, would make the weights larger, so you have to use feature scaling, and the range would be 1 to 100. But if the values are well spread over the range, then you don't need feature scaling.

In general, you should plot your data and look: if the values appear well spread over the range, then it is okay not to use feature scaling.
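A quick numeric stand-in for that visual check, sketched with NumPy on hypothetical data matching the 1,000-plus-20 example above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset: 1,000 targets between 1 and 20,
# plus 20 targets between 95 and 100.
y = np.concatenate([rng.uniform(1, 20, size=1000),
                    rng.uniform(95, 100, size=20)])

# Compare the overall range to where the bulk of the data sits.
low, high = np.percentile(y, [5, 95])
print(y.min(), y.max())  # full range: roughly 1 to 100
print(low, high)         # the middle 90% sits below ~20

# Min-max scaling maps the whole 1-to-100 range into [0, 1].
y_scaled = (y - y.min()) / (y.max() - y.min())
print(y_scaled.min(), y_scaled.max())  # 0.0 and 1.0
```

If the 5th-95th percentile band covers nearly the whole min-max range, the data are well spread and scaling matters less; here the bulk occupies only a small slice of the range, which is the situation where scaling helps.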

For example, here is a plot of data that range from 1 to 100 but are not well spread, so you would need to use feature scaling on them:

As you can see, most of the data range from 15 to 30, but some points are far outside that range.
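One common way to quantify "some points are far outside the bulk" is Tukey's IQR rule. Here is a sketch on hypothetical data resembling the plot described above (the sizes and ranges are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: most values between 15 and 30,
# with a handful of far-away points up to 100.
y = np.concatenate([rng.uniform(15, 30, size=500),
                    rng.uniform(70, 100, size=10)])

# Tukey's rule: points beyond 1.5 * IQR from the quartiles
# count as far from the bulk of the data.
q1, q3 = np.percentile(y, [25, 75])
iqr = q3 - q1
outliers = y[(y < q1 - 1.5 * iqr) | (y > q3 + 1.5 * iqr)]
print(len(outliers))  # → 10, the far-away points
```

When such points exist, they stretch the min-max range well past where most of the data live, which is exactly the case where feature scaling is worth applying.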