Cost increasing after more iterations of gradient descent

Week 1: C1_W1_Lab04_Gradient_Descent_Soln


As you can see, the cost is increasing at iterations 2000, 4000, 5000, 7000, and 9000. Therefore the final output may have a higher cost than at some earlier point of gradient descent. I also built a model based on this. It converged at a cost of 4.287… but at one point it reached a cost of 0.000…
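For context, here is a minimal sketch of the kind of training loop and cost printout being discussed. The toy data, learning rate, iteration count, and print format are assumptions for illustration, not the lab's exact code.

```python
import numpy as np

# Toy data in the spirit of the lab (sizes in 1000 sqft, prices in $1000s);
# these values and the learning rate are assumed for illustration only.
x = np.array([1.0, 2.0])
y = np.array([300.0, 500.0])
w, b = 0.0, 0.0
alpha = 1.0e-1  # learning rate (assumed)

def compute_cost(w, b):
    """Mean squared error cost: J = (1/2m) * sum((w*x + b - y)^2)."""
    return np.mean((w * x + b - y) ** 2) / 2

for i in range(10000):
    err = w * x + b - y
    w -= alpha * np.mean(err * x)   # gradient step for w
    b -= alpha * np.mean(err)       # gradient step for b
    if i % 1000 == 0:
        # Costs are printed in scientific notation, e.g. 1.23e-02 means 0.0123
        print(f"Iteration {i:5d}: Cost {compute_cost(w, b):0.2e}")
```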

How do I prevent this?

The cost is decreasing, not increasing: 10^-2 is bigger than 10^-4!
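A quick way to check this in Python (the specific numbers here are made up for illustration, not taken from the lab output):

```python
# A cost printed as ...e-04 is on the 10^-4 scale, which is smaller than a
# ...e-02 cost, even if its leading digits look larger.
a = 4.9e-02   # 0.049
b = 1.2e-04   # 0.00012
print(a > b)                   # True
print(f"{a:.6f} vs {b:.6f}")   # 0.049000 vs 0.000120
```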

Indeed. Look at the exponents.