# Course 1 Week 2, Logistic Regression Gradient Descent

Could you explain how the slope is positive/negative in gradient descent?

Hi @naina_dwivedi, here's a thread you can look at. Please let me know if this link answers your question; otherwise, we can always take a deeper look at what you've asked. Thanks!

Thanks for the explanation. However, I want to know how the slope is decided to be positive or negative. For example, in the gradient descent lecture, the graph shows that when the random value of w is small, the derivative is negative, so the updated value of w is larger and the descent moves in the positive direction. Could you explain how the slope value turns out to be positive?

Hi @naina_dwivedi, okay, let me explain it for you. As Prof. Andrew mentions in week 2 of Course 1, gradient descent (GD) is an iterative algorithm: at any given point it computes the slope (the derivative of the cost with respect to each parameter) and adjusts the parameters in the direction of the negative slope, i.e. toward the minimum. The sign of the slope is not chosen; it is simply the sign of the derivative at the current point. To the left of the minimum the derivative dJ/dw is negative, so the update w := w − α·(dJ/dw) increases w; to the right of the minimum the derivative is positive, so the update decreases w. Either way, w moves toward the minimum.
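A minimal sketch of this (my own toy example, not from the lecture): with J(w) = w², the derivative is dJ/dw = 2w, so the slope is negative left of the minimum at w = 0 and positive to the right, and the update rule moves w toward 0 in both cases.

```python
def dJ_dw(w):
    # Derivative of the toy cost J(w) = w**2
    return 2 * w

alpha = 0.1  # learning rate (an assumed illustrative value)

# Left of the minimum: slope is negative, so w - alpha*dJ/dw increases w.
w = -3.0
print(dJ_dw(w), w - alpha * dJ_dw(w))  # slope -6.0, w moves up to -2.4

# Right of the minimum: slope is positive, so the update decreases w.
w = 3.0
print(dJ_dw(w), w - alpha * dJ_dw(w))  # slope 6.0, w moves down to 2.4
```

So the same update rule produces a "positive" or "negative" step automatically, depending only on which side of the minimum the current w sits.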

Logistic regression is a special case in that its cost function is actually convex. But convexity alone does not guarantee convergence: if we choose too high a learning rate, gradient descent can overshoot and diverge rather than converge, so the objective function increases at each step instead of decreasing. To make gradient descent work, the learning rate should be small enough that each step reduces the cost and gives a good approximate minimum. Hope this explanation works for you.
