C2_W2 Loss vs Cost

Loss is the error estimate for one predicted value versus the real one. Cost is the average of the losses across the whole training set. Why does Andrew sometimes use "loss" when talking about error estimates, but at other times use "cost"?
I thought the cost is what we estimate for any model, and that we find the needed parameters where the cost is lowest (with the help of gradient descent).

Loss and cost are often used interchangeably; the usage isn't consistent across the industry.

But the term "loss" refers to a single sample, even on the slide you shared, while "cost" refers to the average over the training set.
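
A minimal sketch of the distinction, assuming squared-error loss (the specific loss function and variable names here are illustrative, not from the slide):

```python
import numpy as np

def loss(y_hat, y):
    # Loss: error for ONE example (squared error chosen as an illustration)
    return (y_hat - y) ** 2

def cost(y_hat_all, y_all):
    # Cost: average of the per-example losses over the whole training set
    return np.mean([loss(yh, y) for yh, y in zip(y_hat_all, y_all)])

preds = np.array([2.0, 3.0, 5.0])
targets = np.array([2.5, 3.0, 4.0])
print(loss(preds[0], targets[0]))  # loss for a single example: 0.25
print(cost(preds, targets))        # cost over all three examples
```

Gradient descent minimizes the cost, so that the parameters it finds do well on average across the training set rather than on any single example.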