Right! Which means the loss will be between 0 and +\infty.
Here’s a thread from DLS which discusses the cross entropy loss in more detail and shows the graph of the log function between 0 and 1.
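To see why the loss spans that range, here is a minimal sketch (not from the thread, just an illustration) of the per-example cross entropy \(-\log(p)\) for the true class, evaluated at a few predicted probabilities:

```python
import math

# Cross entropy for the true class with predicted probability p is -log(p).
# As p -> 1 the loss -> 0; as p -> 0 the loss grows without bound (+infinity).
for p in [1.0, 0.5, 0.1, 0.001]:
    print(f"p = {p:<6} loss = {-math.log(p):.4f}")
```

So a confident correct prediction (p near 1) gives a loss near 0, while a confident wrong one (p near 0) gives an arbitrarily large loss.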