In the course Neural Networks and Deep Learning, Week 2, in the video Logistic Regression Cost Function, the cost function is defined as a measure of how well an algorithm does on the entire training set. Why, then, at the end of the video, does Andrew say the cost function should be as low as possible?
The point is that the cost (or "loss") function measures how well the algorithm's predictions match the "labels", which are the known correct answers. The larger the cost, the worse the predictions match the labels. So the goal of training and back propagation is to drive the parameters of the model toward values that produce a lower cost. The lower the cost, the better the predictions.
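To make this concrete, here is a minimal sketch of the logistic regression cost from that video, J = -(1/m) Σ [y log(ŷ) + (1-y) log(1-ŷ)], evaluated on a tiny made-up dataset (the data and parameter values below are illustrative, not from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(w, b, X, Y):
    """Average cross-entropy cost over all m training examples."""
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)   # predictions y-hat, shape (1, m)
    return float(-np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m)

# Toy data: 1 feature, 4 examples (hypothetical values for illustration).
X = np.array([[0.0, 1.0, 2.0, 3.0]])
Y = np.array([[0, 0, 1, 1]])          # known correct labels

# Parameters whose predictions match the labels give a low cost;
# parameters that predict the opposite give a high cost.
good = cost(np.array([[5.0]]), -7.5, X, Y)   # boundary near x = 1.5
bad  = cost(np.array([[-5.0]]), 7.5, X, Y)   # same boundary, flipped sign
```

Running this, `good` comes out small and `bad` large, which is exactly why training seeks parameters that minimize J.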