Loss Function Labeling Question? W3 Lab06

Hello,
Can someone please explain why the loss value between the individual points and the sigmoid curve in the Week 3 logistic regression gradient descent lab shows as 0.7 rather than 0.3 for all the points by default (when w and b are set to 0)?
Because -log(1 - (1/2)) = 0.3, not 0.7.

Figured it out… TIL that all the log notations here are actually natural logarithms (ln). This strikes me as unnecessarily confusing. Why not explicitly write ln() rather than log() in the mathematical notation? I get that the default log in Python is the natural log, but why not be explicit, or at least put a note somewhere in the course material as a heads-up for those of us unaware of this potentially confusing convention?


Hi @Tim_Gregg, what course is this from? Thanks!

David

Could you please be specific about what you figured out? Do you mean that log defaults to base e?

“ln” tends to be the mathematical notation.

Since the folks who design programming languages tend not to be mathematicians, the course uses the programming convention: log() is the natural log, and log10() is the base-10 log.

Different fields use different conventions.
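Python's standard library follows the same convention, and `math.log` even accepts an explicit base if you want to avoid any ambiguity:

```python
import math

# log() with no base is the natural log (base e).
print(math.log(math.e))   # 1.0

# Base-10 log has its own function.
print(math.log10(100))    # 2.0

# math.log also takes an optional base argument.
print(math.log(8, 2))     # 3.0
```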


Thanks for the explanation @TMosh
After a frustrating half hour trying to work out this discrepancy myself, GPT-4 helped me understand that that's what was going on here.

Or you could have posted the question here on the forum.

I did! And then fiddled with it a bit more. Thanks for your help