Reasons for the cost function not saturating

I tried to implement polynomial regression without scikit. Whenever I try to minimize my cost function using gradient descent, it decreases but never saturates, no matter how many iterations I run. Can someone suggest possible reasons for this?

Tips:

  • Normalize the data set before training (see the sketch after this list).
  • Try different learning rates.
  • Try using more or fewer iterations.
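
Here is a minimal sketch of those tips, assuming a NumPy implementation with a mean-squared-error cost. The names (`X_poly`, `y`, `alpha`, `num_iters`) are placeholders, not taken from your code.

```python
import numpy as np

def normalize(X):
    """Z-score normalization. This matters a lot for polynomial features,
    whose columns can differ in scale by orders of magnitude."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent on the MSE cost; returns parameters and cost history."""
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    cost_history = []
    for _ in range(num_iters):
        preds = X @ w + b
        err = preds - y
        cost = (err @ err) / (2 * m)
        cost_history.append(cost)
        # Gradients of the MSE cost with respect to w and b
        w -= alpha * (X.T @ err) / m
        b -= alpha * err.sum() / m
    return w, b, cost_history

# Typical usage (X_poly and y stand in for your polynomial features and targets):
# X_norm, mu, sigma = normalize(X_poly)
# w, b, cost_history = gradient_descent(X_norm, y, alpha=0.01, num_iters=1000)
```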

If you want more suggestions, please post a plot of the cost history during training.
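
For example, if you collected the cost values into a list during training (`cost_history` here is just an assumed name), something like this would produce the plot:

```python
import matplotlib.pyplot as plt

def plot_cost_history(cost_history):
    """Plot cost per iteration. A curve that flattens out indicates convergence;
    one that keeps sloping down (or oscillates) suggests more iterations,
    a different learning rate, or missing normalization."""
    plt.plot(cost_history)
    plt.xlabel("Iteration")
    plt.ylabel("Cost")
    plt.title("Cost history during training")
    plt.show()
```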

Also, when you say “not saturated”, do you mean “did not converge to a stable minimum value”?

@TMosh, yes, that is what I mean when I say not saturated.
[Image: plot of the cost function during training]
This is my cost function, please take a look at it.

The cost is decreasing, so that’s good.
Have you tried the suggestions I listed?