Gradient descent iterations

What is

```python
if i < 100000:  # prevent resource exhaustion
    J_history.append(cost_function(x, y, w, b))
    p_history.append([w, b])
```

in the gradient descent computation? I understand that `J_history` and `p_history` are used to store values, but what is meant by "prevent resource exhaustion"? I don't get it…

Quite simply, it means that once the loop has performed 100,000 iterations, the code stops recording those values, so the lists can't keep growing without limit!


But we are already telling the loop how many iterations to run by giving it the number of iterations as an input, like
`for i in range(num_iters)`, so why would it ever go to infinity?

We don’t have any other criterion for telling it when to stop.


I have not seen this assignment, but I think this is what it is referring to!
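For what it's worth, a loop like this *could* be given another stopping criterion, such as exiting early once the cost stops changing meaningfully. This is just a hedged sketch of that idea, not code from the course; `run_until_converged`, `step`, `cost`, and `tol` are all hypothetical names:

```python
def run_until_converged(step, cost, num_iters, tol=1e-8):
    """Run at most num_iters gradient-descent updates, but exit early
    once the cost changes by less than tol between iterations.

    step() performs one parameter update; cost() returns the current cost.
    Returns the number of iterations actually used.
    """
    prev = cost()
    for i in range(num_iters):
        step()
        curr = cost()
        if abs(prev - curr) < tol:  # cost barely changed: treat as converged
            return i + 1
        prev = curr
    return num_iters


# Toy usage: minimize (w - 3)^2 with a fixed learning rate of 0.1.
w = [0.0]  # mutable so the closures below can update it

def step():
    w[0] -= 0.1 * 2.0 * (w[0] - 3.0)  # gradient of (w - 3)^2 is 2(w - 3)

def cost():
    return (w[0] - 3.0) ** 2

used = run_until_converged(step, cost, 10000)
```

With this kind of check, `num_iters` becomes an upper bound rather than the only stopping rule.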


OK, I get it: maybe we are telling the code to store values up to a maximum of 100,000 iterations. For example, if it takes more than 100,000 iterations to converge, we are setting an upper limit on the stored history for the sake of memory conservation.
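That reading can be sketched in context. Below is a minimal, hypothetical gradient-descent loop for simple linear regression (the course's actual lab differs in details such as variable names and the cost function); note the `if` only caps how much history is stored, while the loop itself still runs for all `num_iters` iterations:

```python
def cost_function(x, y, w, b):
    # Mean squared error for the linear model f(x) = w*x + b
    m = len(x)
    return sum(((w * xi + b) - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

def gradient_descent(x, y, w, b, alpha, num_iters, history_cap=100000):
    m = len(x)
    J_history, p_history = [], []
    for i in range(num_iters):
        # Gradients of the mean-squared-error cost
        dj_dw = sum(((w * xi + b) - yi) * xi for xi, yi in zip(x, y)) / m
        dj_db = sum(((w * xi + b) - yi) for xi, yi in zip(x, y)) / m
        w -= alpha * dj_dw
        b -= alpha * dj_db
        if i < history_cap:  # record history only up to the cap; memory stays bounded
            J_history.append(cost_function(x, y, w, b))
            p_history.append([w, b])
    return w, b, J_history, p_history

# Toy usage: fit y = 2x from four points
w, b, J_history, p_history = gradient_descent(
    [1, 2, 3, 4], [2, 4, 6, 8], 0.0, 0.0, alpha=0.01, num_iters=5000
)
```

If `num_iters` exceeds `history_cap`, the parameters keep updating past the cap; only the bookkeeping lists stop growing.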

Thank you guys…!
