Practice Quiz: Train the model with gradient descent, Q1

I puzzled over this question.

From the cost function intuition example, wouldn't the change in w, being rise over run, be negative as w moves towards 1 from the right?


So in this scenario wouldn't \frac{\partial}{\partial w} J(w,b) be negative, and thus w become more positive?

'w' becomes a larger value, because subtracting a negative number is equivalent to adding (it's algebra).
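To make that concrete, here is the gradient-descent update written out (with \alpha the learning rate):

w := w - \alpha \frac{\partial}{\partial w} J(w,b)

If the derivative at the current w is, say, -2, the update becomes w := w - \alpha(-2) = w + 2\alpha, so w moves to the right and becomes larger.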

Indeed. So how does w converge in this scenario?
I guess maybe that's the point: it can't?

It certainly can converge; the weight values will move toward whatever gives the minimum cost.
The weights can be either positive or negative.
It’s only the cost value that is always positive.
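As a sketch of why it converges: below is a minimal gradient-descent loop on an illustrative one-dimensional cost J(w) = (w - 1)^2 (an assumed toy cost, not the course's actual function; its minimum is at w = 1). Starting to the left of the minimum, the derivative is negative, so w grows; the steps shrink as w approaches 1, and the iteration settles at the minimum rather than overshooting forever.

```python
def dJ_dw(w):
    # Derivative of the toy cost J(w) = (w - 1)^2 with respect to w.
    # Negative when w < 1, positive when w > 1, zero at the minimum.
    return 2 * (w - 1)

def gradient_descent(w=0.0, alpha=0.1, steps=100):
    for _ in range(steps):
        grad = dJ_dw(w)       # negative to the left of the minimum
        w = w - alpha * grad  # subtracting a negative gradient increases w
    return w

w_final = gradient_descent()  # w climbs from 0 toward the minimum at 1
```

The same loop also converges when starting from the right of the minimum (e.g. `gradient_descent(w=2.0)`), where the gradient is positive and w decreases, which is the symmetric case to the one discussed above.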
