Gradient descent C1_W1

When a cost function has multiple minima, gradient descent may converge to a local minimum. But the cost function is only truly minimized at the global minimum, right? How do we get there when gradient descent has already converged to some local minimum? How should we approach this so that we find an optimal solution and minimize the cost function? Please help me with this doubt.


Hello @Sathvik_R

Welcome to the community.

In Course 1 we deal with two cases: (1) linear regression with the squared error cost function, and (2) logistic regression with the logistic loss function.

In both of these cases, the cost function is convex, so there is only a single global minimum and no local minima.
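To see this concretely, here is a minimal sketch (not from the course materials, using a made-up 1-D dataset with b fixed at 0) that sweeps w and prints the squared error cost J(w). The costs fall toward a single minimum and rise again afterwards, tracing out one bowl with no local minima:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])   # hypothetical inputs
y = np.array([2.0, 4.0, 6.0])   # hypothetical targets (y = 2x)
m = len(x)

def cost(w, b=0.0):
    """Squared error cost J(w, b) = (1 / 2m) * sum((w*x + b - y)^2)."""
    return np.sum((w * x + b - y) ** 2) / (2 * m)

# Sweep w over a range and print the cost at each value.
for w in np.linspace(0.0, 4.0, 9):
    print(f"w = {w:4.1f}  J(w) = {cost(w):7.3f}")
# The printed costs decrease toward w = 2 and increase afterwards:
# a single global minimum and no local minima.
```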

In Course 2, however, we will look at more advanced models where the issue of local minima does come up.

So, for now you don't have to worry about getting stuck in a local minimum during gradient descent, because there is only one global minimum for the models we are looking at.
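As a quick illustration (again a sketch with made-up data and hyperparameters, not the course's own code), running gradient descent on the same squared error cost from several very different starting points converges to essentially the same (w, b), precisely because the cost is convex:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
m = len(x)

def gradient_descent(w, b, alpha=0.05, iters=2000):
    """Plain batch gradient descent on the squared error cost."""
    for _ in range(iters):
        err = w * x + b - y
        dj_dw = np.dot(err, x) / m   # partial derivative of J w.r.t. w
        dj_db = np.sum(err) / m      # partial derivative of J w.r.t. b
        w -= alpha * dj_dw
        b -= alpha * dj_db
    return w, b

# Three arbitrary initializations; all end up at (approximately) the same minimum.
for w0, b0 in [(-5.0, 3.0), (0.0, 0.0), (10.0, -7.0)]:
    w, b = gradient_descent(w0, b0)
    print(f"start (w={w0:5.1f}, b={b0:5.1f}) -> converged to (w={w:.3f}, b={b:.3f})")
```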

Yes, I understand that the squared error cost function will always have a bowl-shaped graph. So what you mean is that later in the specialization we will learn how to tackle such problems and find the optimum solution, right?