Why do primitive learning algorithms such as linear regression not scale well with big data, unlike neural networks?



Hello @sunnybala

The simple way to think about it is that linear regression can only capture linear relationships between variables, whereas big data usually contains non-linear relationships that neural networks can capture, because they have far more capacity to combine many features, biases, and learned interactions. Linear regression is of little help on problems that call for that kind of flexible, high-variance model.
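
As a quick illustration (my own sketch using scikit-learn, not something from the lectures), a linear model barely improves on a constant predictor when the target is non-linear, while even a small neural network fits it well:

```python
# Minimal sketch (not from the course): a linear model vs. a small neural
# network on data with a clearly non-linear relationship.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=2000)  # non-linear target

linear = LinearRegression().fit(X, y)
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X, y)

# A straight line cannot follow the oscillation, so the linear R^2 stays low,
# while the MLP's R^2 gets much closer to 1.
print("linear R^2:", linear.score(X, y))
print("mlp R^2:   ", mlp.score(X, y))
```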

Linear regression is also not recommended when the number of observations is not large in proportion to the number of features. It is sensitive to outliers as well, which can distort the analysis when working with bigger datasets.
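
And a small sketch of the outlier point (again my own example, assuming scikit-learn): a single extreme observation is enough to visibly pull the least-squares slope away from the true value:

```python
# Minimal sketch (my own illustration, not from the lectures): one extreme
# outlier noticeably changes an ordinary least squares fit.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=100)  # true slope = 2

clean_slope = LinearRegression().fit(X, y).coef_[0]

# Corrupt the data with one extreme observation.
X_out = np.vstack([X, [[10.0]]])
y_out = np.append(y, 200.0)
outlier_slope = LinearRegression().fit(X_out, y_out).coef_[0]

print("slope without outlier:", clean_slope)    # close to 2.0
print("slope with one outlier:", outlier_slope)  # pulled away from 2.0
```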

Regards
DP


I am curious why you believe this. Did you see it in one of the lectures?