W2_Video Lecture_Cost Function For Logistic Regression


I’m not quite clear on why, at the 6:20 mark, Professor Ng removed the negative sign from the front of the summation.

I think I followed his explanation when he said that we introduce the negative sign (-) because we want to minimize the cost function J(w,b) instead of maximizing the likelihood (referring to the introduction of the negative sign at the 4:08 mark).

But I am confused because, throughout the first week and up to this video, the cost function over m examples has always had -1/m in front of the summation, including the pseudocode in the Vectorizing Logistic Regression video. The Machine Learning Specialization has also always shown the negative sign (-) in front of the cost function and loss function whenever logistic regression is discussed.

I don’t know which video you are talking about. Please share a link or screenshot.

But I guess the negative sign was distributed into the terms inside the summation. For example, the two equations below are the same.
X = - (-y^2 - 2)
X = y^2 + 2
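To see that the same distribution of the minus sign leaves the logistic cost unchanged, here is a small numeric sketch (not from the course; the function names and sample values are made up for illustration) comparing the form with the minus in front of the summation against the form with the minus pushed into each term:

```python
import numpy as np

def cost_outer_minus(y, f):
    # J(w,b) = -(1/m) * sum( y*log(f) + (1-y)*log(1-f) )
    m = len(y)
    return -(1 / m) * np.sum(y * np.log(f) + (1 - y) * np.log(1 - f))

def cost_inner_minus(y, f):
    # Same cost with the minus distributed into the terms:
    # J(w,b) = (1/m) * sum( -y*log(f) - (1-y)*log(1-f) )
    m = len(y)
    return (1 / m) * np.sum(-y * np.log(f) - (1 - y) * np.log(1 - f))

y = np.array([1, 0, 1, 1, 0])             # example labels
f = np.array([0.9, 0.2, 0.7, 0.6, 0.1])   # hypothetical sigmoid outputs

print(np.isclose(cost_outer_minus(y, f), cost_inner_minus(y, f)))  # True
```

Both functions give the same value, so whether the lecture writes the negative sign outside the summation or inside each term is purely a matter of algebraic presentation.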

@saifkhanengr, @sohamshah is talking about the Logistic Regression Cost Function video, and you are right: the negative sign is simply distributed into the terms of the expression.

Thanks @saifkhanengr , @Rashmi.
Yeah, I guess that’s what he must have meant, because he explicitly mentioned removing the (-) sign but was using the shorter form of the expression.

You’ve got it right, @sohamshah.