Minor error in video - Course 1, Week 2

I believe there is a small error in the loss function written at 3:28 in the video “Logistic Regression Cost Function” in Week 2 of Course 1: the second term should be -(1 - y) log(1 - y_hat), not +(1 - y) log(1 - y_hat).

The formula is correct as written: check the parentheses. What Prof Ng did there is factor out the -1 since both log terms are negative. The logarithm of a number between 0 and 1 is negative, but we always define loss or cost to be positive.
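
To make the factoring explicit, here is the loss in the standard form the course uses, with the leading -1 distributed across both terms:

```latex
\mathcal{L}(\hat{y}, y)
  = -\bigl( y \log \hat{y} + (1 - y) \log (1 - \hat{y}) \bigr)
  = -\, y \log \hat{y} \, - \, (1 - y) \log (1 - \hat{y})
```

Distributed out, the second term carries exactly the minus sign you expected; it was just sitting inside the parentheses.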

One other thing to note here, though I don’t remember whether Prof Ng mentions it explicitly, is that all instances of “log” here are natural logs, not logs base 10. The notation in the ML/DL world is different from that in the math world.
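
NumPy, which the programming assignments use, follows the same convention: np.log is the natural log, and base 10 is a separate function.

```python
import numpy as np

# In NumPy, np.log is the natural log (base e);
# the base-10 version is the separate function np.log10.
print(np.log(np.e))    # 1.0 -> natural log
print(np.log10(10.0))  # 1.0 -> base-10 log
```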

Ah, yes, you’re right. I missed the parentheses. Thank you for pointing that out. :slightly_smiling_face:

Also, thanks for the note about natural logs. I suspected as much, based on the derivative of log that Prof Ng writes in a future video.

Great! Yes, if you know calculus, you’ll see right away that the logs are natural. When you take derivatives, you’d be crazy to use anything else: you would just get bogus constant factors strewn about the place for no added value in terms of behavior. The shapes of the curves are all the same, so why not use e and be happy? :nerd_face:
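
Concretely, those are the constant factors I mean: the derivative of the natural log is clean, while any other base b drags along a 1/ln(b):

```latex
\frac{d}{dz} \ln z = \frac{1}{z}
\qquad \textrm{vs.} \qquad
\frac{d}{dz} \log_b z = \frac{1}{z \ln b}
```

Same shape, just rescaled everywhere by that constant.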
