W3_Classification with Perceptron - Gradient Descent_The origin of the loss function

Where does the loss function come from?
I have gone back through all the material in the lecture notes, but there is no detailed explanation of it.

It’s presented at this time mark in the video.

It is explained a little further in the associated lab:

Hello, I found watching Andrew Ng's lecture from 23:00 to 46:00 quite enlightening regarding the derivation. Sharing it for others as well. Cheers.

[Locally Weighted & Logistic Regression | Stanford CS229: Machine Learning - Lecture 3 (Autumn 2018)] {moderator edit: incorrect link removed - see below for correct link}

It seems your link is pointing to the wrong location. Can you please re-share it?

https://youtu.be/het9HFqo1TQ?feature=shared

Hello, yep, you're right. The correct link is the one above. I watched it from 21:00 until the end of the logistic regression part. Thanks for letting me know.

Thanks!
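In case a runnable check is useful too, here is a small self-contained sketch (not the course's notebook; the toy data, variable names, and learning rate are made up) that implements that log loss and the matching gradient-descent updates for a single sigmoid neuron:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(y, y_hat):
    # Average negative log-likelihood of Bernoulli labels.
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# Toy data: 4 examples with 2 features each, for illustration only.
X = np.array([[0.5, 1.2], [1.0, -0.3], [-0.7, 0.8], [-1.5, -1.0]])
y = np.array([1, 1, 0, 0])

w = np.zeros(2)
b = 0.0
lr = 0.1  # learning rate

for _ in range(100):
    y_hat = sigmoid(X @ w + b)
    # Gradients of the log loss: the (y_hat - y) * x form from the derivation above.
    dw = (X.T @ (y_hat - y)) / len(y)
    db = np.mean(y_hat - y)
    w -= lr * dw
    b -= lr * db

print("final loss:", log_loss(y, sigmoid(X @ w + b)))

The loss should shrink steadily over the iterations, which is a quick way to convince yourself the derived gradient really does point downhill.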