Cost Function of Linear Regression

What confuses me is how the cost function graph is exactly a parabola. I suppose there are so many training examples that they end up providing every discrete point of the parabola.
If I said something wrong, then please correct me.


Hello @farhana_hossain,

If you want to see that, take b = 0 and expand the square in the following cost function:

J(w) = (1/(2m)) Σᵢ₌₁ᵐ (w·x⁽ⁱ⁾ − y⁽ⁱ⁾)²

and see if you can rewrite it into the form J = Aw² + Bw + C.

If you want to discuss your calculation steps, please share them.
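A quick way to check the expansion numerically: the sketch below (with a made-up toy dataset, and b fixed to 0) computes the cost directly and compares it against the expanded coefficients A = Σx²/(2m), B = −Σxy/m, C = Σy²/(2m).

```python
import numpy as np

# Hypothetical toy training set -- any numbers work for this check
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
m = len(x)

def cost(w):
    # Squared-error cost with b = 0: J(w) = (1/(2m)) * sum((w*x - y)^2)
    return np.sum((w * x - y) ** 2) / (2 * m)

# Expanding the square gives J(w) = A*w^2 + B*w + C with:
A = np.sum(x ** 2) / (2 * m)
B = -np.sum(x * y) / m
C = np.sum(y ** 2) / (2 * m)

# The direct cost and the quadratic form agree for any w
for w in [-1.0, 0.0, 0.5, 2.0]:
    assert np.isclose(cost(w), A * w ** 2 + B * w + C)
```

Since A > 0 whenever the x values are not all zero, the curve opens upward, which is why the plot is a parabola with a single minimum.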



Not exactly.

The shape is a parabola because it’s plotted vs the weight values, which are real floating point numbers.

The examples are used to learn the best weights, but they aren’t directly involved in the shape of the cost vs. weight curve.
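To illustrate that point, here is a small sketch (with two made-up datasets) that sweeps real-valued weights and fits a degree-2 polynomial to the sampled cost curve: the data changes the parabola's coefficients and where its minimum sits, but never the fact that it is a parabola.

```python
import numpy as np

# Two different hypothetical training sets: the examples determine the
# parabola's coefficients, not whether the curve is a parabola.
datasets = [
    (np.array([1.0, 2.0, 3.0]), np.array([1.0, 3.0, 2.0])),
    (np.array([0.5, 1.5, 2.5, 4.0]), np.array([2.0, 2.5, 5.0, 7.5])),
]

for x, y in datasets:
    m = len(x)
    ws = np.linspace(-5, 5, 201)  # sweep real-valued weights (b = 0)
    J = np.array([np.sum((w * x - y) ** 2) / (2 * m) for w in ws])
    # A degree-2 polynomial fits the sampled curve exactly, confirming
    # the cost-vs-w curve is a parabola for either dataset.
    coeffs = np.polyfit(ws, J, 2)
    assert np.allclose(np.polyval(coeffs, ws), J)
    assert coeffs[0] > 0  # opens upward, so a unique minimum exists
```

The training examples only set the values of the coefficients; the quadratic dependence on w comes from squaring the error term.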


So I need to correct my statement: it is not every possible value of the training examples, but every possible value of w, that provides every point on the parabola. I hope I am correct this time!


I think you are.


thanks <3