Right! The point is that our \hat{y} values are between 0 and 1. Take a look at the graph of the natural log function and you’ll see that it is negative on the interval (0, 1): the range of the function on that domain is (-\infty, 0). So we need to multiply by -1 to get a positive value for the cross entropy loss.
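The sign flip is easy to check numerically. Here’s a minimal sketch of binary cross entropy for a single example (the function name is my own, not from any particular library):

```python
import math

def cross_entropy(y, y_hat):
    # Binary cross entropy for one example. Since y_hat is in (0, 1),
    # log(y_hat) and log(1 - y_hat) are both negative, so the leading
    # minus sign makes the loss positive.
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

for y_hat in (0.1, 0.5, 0.9):
    print(round(cross_entropy(1, y_hat), 4))
```

Notice the loss shrinks toward 0 as the prediction approaches the true label, and blows up as it approaches the wrong one.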

Here’s a nice explanation of cross entropy loss from Raymond.