Gradient Descent Negative Values

Hi,

I had a question regarding this video. https://www.coursera.org/learn/machine-learning/lecture/2f2PA/gradient-descent

Considering the fact that the cost function is a sum of squared, hence positive, terms, is it possible that in the diagram in that video some of the values of the cost function are below zero?

I’d appreciate any clarification.

Squared error can be zero, but it cannot be negative because, as you said, the errors are squared.
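
For reference, the usual squared error cost from the lectures is an average of squared terms, so every term is non-negative (writing $h_\theta$ for the model's prediction over $m$ training examples):

$$
J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 \;\ge\; 0
$$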

However, Andrew noted on the slide that it is not the squared error cost function, so he was free to pick a cost surface that can have negative values. In fact, the choice does not matter much, because the purpose here is to demonstrate the idea of gradient descent - the idea of going down the hill.
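
To make the "going down the hill" idea concrete, here is a minimal sketch of gradient descent on a one-dimensional cost surface. The cost function, starting point, and learning rate below are just illustrative choices, not the course's code:

```python
# Minimal gradient descent sketch on an illustrative 1-D cost surface.

def cost(w):
    return (w - 3.0) ** 2          # illustrative cost with its minimum at w = 3

def gradient(w):
    return 2.0 * (w - 3.0)         # dJ/dw for the cost above

w = -5.0                           # arbitrary starting point
alpha = 0.1                        # learning rate

for _ in range(50):
    w -= alpha * gradient(w)       # take a small step "down the hill"

print(w, cost(w))                  # w approaches 3, cost approaches 0
```

The update rule w := w - alpha * dJ/dw works the same way on any smooth surface, which is why the particular shape drawn in the video does not change the idea being demonstrated.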

Cheers,
Raymond


Hi Raymond,

I appreciate the response and thanks for the clarification.
I almost forgot about the other ways to define the cost function; I was too focused on that negative bit and doing the math in my head, so I didn’t notice the note right before my eyes :D.

Regards,
Mostafa


You are welcome, @mostapha!