# Regression with multiple features

y = w1x1 + w2x2 + b

For different combinations of x1 and x2 values, is the function y always linear?

For example, combinations like x1 = 0.1 and x2 = 100, or x1 = 100 and x2 = -10, among other possibilities.

It depends on what you mean by “linear”.

In machine learning, “linear” usually means that the model is a linear function of the weights and features: that is, a sum of products of weights and features.

It doesn’t mean the graph of the function is a straight line.
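One quick way to see “linear in the weights” concretely is to check the superposition property: scaling and adding weight vectors scales and adds the predictions accordingly. The sketch below (with made-up weight and feature values, including one combination from the question) is just an illustration of that property, not any particular course's code:

```python
import numpy as np

# A hypothetical check that y = w1*x1 + w2*x2 (bias set to 0) is linear in the weights.
def predict(w, b, x):
    return np.dot(w, x) + b

x = np.array([0.1, 100.0])     # one feature combination from the question
wa = np.array([2.0, -0.5])     # two arbitrary weight vectors
wb = np.array([1.0, 3.0])
a, c = 0.7, 1.3                # arbitrary scalars

# Superposition: predict(a*wa + c*wb) should equal a*predict(wa) + c*predict(wb)
lhs = predict(a * wa + c * wb, 0.0, x)
rhs = a * predict(wa, 0.0, x) + c * predict(wb, 0.0, x)
print(np.isclose(lhs, rhs))  # True
```

No matter which feature values you plug in, this identity holds, which is what “linear” refers to here.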

Since a linear function doesn’t always produce a straight line, two questions popped into my head:

1. Can scaling the features make a linear function with multiple features (as described above) produce a straight line?

2. Will the cost function for the linear model be convex over all possible combinations of w1 and w2?

Let me begin with a summary:

| model | y against x’s | squared loss against w’s |
|---|---|---|
| y = b + w1x1 | straight line | convex |
| y = b + w1x1 + w2x2 | plane | convex |

y = b + w1x1 is always a straight line, and always linear, whether or not the feature is scaled, and for any feature values.

y = b + w1x1 + w2x2 is always a plane, and always linear, whether or not the features are scaled, and for any feature values.
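Here is a small sketch of why scaling cannot change the shape: if you scale a feature by s, dividing its weight by s reproduces exactly the same predictions, so the model family is the same plane either way. The scaling factor and weight values below are made up for illustration:

```python
import math

# Hypothetical values: scaling x1 by s is absorbed by rescaling w1,
# so y = b + w1*x1 + w2*x2 describes the same plane before and after scaling.
s = 0.01                       # made-up scaling factor
w1, w2, b = 3.0, -2.0, 0.5     # made-up weights
x1, x2 = 100.0, -10.0          # one combination from the question

y_raw = b + w1 * x1 + w2 * x2
y_scaled = b + (w1 / s) * (s * x1) + w2 * x2
print(math.isclose(y_raw, y_scaled))  # True
```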

And the squared-loss cost function is always convex in both cases.
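A way to convince yourself numerically: the Hessian of the mean squared error with respect to (w1, w2, b) is (2/n)·AᵀA, where A stacks the features with a column of ones, and AᵀA is positive semidefinite for any feature values whatsoever. This sketch (with randomly generated features as a stand-in for any data set) checks that all Hessian eigenvalues are non-negative:

```python
import numpy as np

# Randomly generated features as a stand-in for arbitrary data.
rng = np.random.default_rng(0)
n = 50
X = rng.uniform(-100, 100, size=(n, 2))   # arbitrary (x1, x2) combinations
A = np.hstack([X, np.ones((n, 1))])       # append a bias column for b

# Hessian of mean squared error w.r.t. (w1, w2, b): constant in w, equal to (2/n) A^T A.
H = (2.0 / n) * A.T @ A
eigvals = np.linalg.eigvalsh(H)
print(np.all(eigvals >= -1e-9))  # True: positive semidefinite, hence convex
```

Because the Hessian does not depend on the weights at all, no choice of w1 and w2 can make the cost non-convex.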

Actually, @farhana_hossain, I am not sure I fully understand your questions:

it seems to me you think some specific feature values can change a straight line into something that is not a straight line; and
it seems to me you think some specific weight values can change a convex cost function into a non-convex one.

However, neither of these can happen. So if you still think they are possible, please elaborate on your reasoning. Maybe you can draw how any feature values would change a straight line into something that isn’t one?

Yes! I thought about it again, and I get it now! Thank you, Raymond! Yes, the cost function is always convex.