Hi,
So far all my tests have passed, but I have a question regarding the cost while the model is training. According to the screenshot I took below, the cost increases as training progresses. Shouldn’t it decrease instead?
Thanks!
Yes, it’s interesting that the accuracy improves but the cost increases.
I’ll look into why that happens.
Here is an explanation. Fortunately, I had seen this issue before and left myself notes about it in my notebook.
Since the model uses stochastic gradient descent, the cost should be computed using only the current example, not the entire set of examples.
So compute the cost using Y_oh[i], the one-hot label for the current example i, rather than the full matrix Y_oh.
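For anyone else who hits the same symptom, here is a minimal sketch of the per-example cost under stochastic gradient descent. Only the names Y_oh, a, and i come from the notebook; the softmax, the parameters W and b, and the synthetic data are stand-ins for illustration, not the assignment code.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class scores.
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Hypothetical stand-in data: m examples, one-hot labels Y_oh of shape (m, n_classes).
rng = np.random.default_rng(0)
m, n_features, n_classes = 100, 50, 5
X = rng.standard_normal((m, n_features))
Y_oh = np.eye(n_classes)[rng.integers(0, n_classes, size=m)]
W = rng.standard_normal((n_classes, n_features)) * 0.01
b = np.zeros(n_classes)
lr = 0.01

for i in range(m):                       # stochastic gradient descent: one example per step
    a = softmax(W @ X[i] + b)
    # Correct: cross-entropy cost of the current example only.
    cost = -np.sum(Y_oh[i] * np.log(a))
    # Buggy version: -np.sum(Y_oh * np.log(a)) broadcasts log(a) against every row
    # of Y_oh, mixing in labels from all examples; that sum can keep climbing even
    # while predictions on example i get better, which matches the symptom above.
    dz = a - Y_oh[i]                     # gradient of softmax + cross-entropy w.r.t. z
    W -= lr * np.outer(dz, X[i])
    b -= lr * dz
```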