Difference between Cost function and Loss function

Hey Machine Learning folks,
I really don’t understand the loss function. Can anyone explain it in an understandable way?
Also, what is the difference between the cost function and the loss function in logistic regression?
And why don’t we calculate the loss function in linear regression?

Thank you

Hi @manobharathi_m ,

In general, Prof. Ng uses *loss* for a single example and *cost* for the total (or average) of the loss over all the examples, although the two terms are often used interchangeably. Paying attention to the formula tells you whether it is the loss or the cost that Prof. Ng is referring to: for the cost, you will see a summation sign in the formula.
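As a quick sketch of that distinction (the prediction numbers below are made up for illustration, not taken from the course), the logistic loss applies to one example, while the cost carries the summation over all m examples:

```python
import numpy as np

def loss(f_x, y):
    # Logistic (cross-entropy) loss for a SINGLE example:
    # prediction f_x in (0, 1), true label y in {0, 1}
    return -y * np.log(f_x) - (1 - y) * np.log(1 - f_x)

def cost(predictions, labels):
    # Cost: the per-example losses summed over all m examples, then averaged.
    # The summation over examples is what marks this formula as the cost.
    m = len(labels)
    return sum(loss(f, y) for f, y in zip(predictions, labels)) / m

preds = np.array([0.9, 0.2, 0.7])  # model outputs (illustrative)
labels = np.array([1, 0, 1])       # true labels
print(cost(preds, labels))         # one number summarising all three losses
```

The names `loss` and `cost` here are just for illustration; the point is that `cost` is the one with the sum over the training set.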

Calculating the loss is a way of measuring how close the model’s prediction is to the true label: if the loss is very small, the prediction is very close to the true label. In the case of logistic regression, a binary classifier whose target label is either 0 or 1, the loss function is different from the one used in linear regression. At timestamp 2:25 of the lecture video *Cost function for logistic regression*, Prof. Ng explains why the **squared error cost function** used in linear regression cannot be used in logistic regression: the cost curve becomes non-convex, so gradient descent is not guaranteed to find the global minimum (the lowest error).
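You can see that non-convexity point numerically. The tiny one-parameter dataset and the second-difference convexity check below are my own illustration (not from the lecture): a convex curve never bends downward, so its second differences on a grid stay non-negative.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Tiny 1-D dataset (made-up values for illustration)
x = np.array([-2.0, -1.0, 1.0, 3.0])
y = np.array([0.0, 0.0, 1.0, 1.0])

def squared_error_cost(w):
    # Linear-regression-style squared error applied to a sigmoid model
    f = sigmoid(w * x)
    return np.mean((f - y) ** 2) / 2

def logistic_cost(w):
    # Logistic (cross-entropy) cost, written with logaddexp for stability:
    # -log(sigmoid(z)) == logaddexp(0, -z); -log(1 - sigmoid(z)) == logaddexp(0, z)
    z = w * x
    return np.mean(y * np.logaddexp(0.0, -z) + (1 - y) * np.logaddexp(0.0, z))

ws = np.linspace(-10, 10, 401)
sq = np.array([squared_error_cost(w) for w in ws])
lg = np.array([logistic_cost(w) for w in ws])

# A convex curve has non-negative second differences everywhere (up to noise).
print("squared-error cost convex?", bool(np.all(np.diff(sq, 2) >= -1e-9)))
print("logistic cost convex?     ", bool(np.all(np.diff(lg, 2) >= -1e-9)))
```

On this sketch the squared-error curve fails the convexity check while the logistic cost passes it, which matches the point made in the lecture.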

It is often helpful to revisit the lecture videos if in doubt.


Thanks @Kic, now it’s clear to me.