Week 2 Logistic Regression Gradient Descent video: derivative mistake

The formulas shown are correct. What you need to realize is that the notational conventions in the ML/DL world differ from those in the math world: whenever they write “log” here, they mean the natural log, not log base 10. I’m not sure where this convention came from, but one conjecture is that it’s based on the way things work in MATLAB, which was commonly used in the early days of ML. Python has subsequently taken over the world, and note that np.log is also the natural log. In fact, Python uses the same naming as MATLAB: if you want logs base 10, the function is np.log10.
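Here’s a quick sanity check you can run yourself to confirm the NumPy naming:

```python
import numpy as np

# np.log is the natural logarithm (base e), matching the ML convention
print(np.log(np.e))      # 1.0
print(np.log10(100.0))   # 2.0 -- explicit base-10 log
print(np.log2(8.0))      # 3.0 -- explicit base-2 log also exists
```

The same pattern holds in MATLAB: `log` is natural, `log10` and `log2` are the explicit-base versions.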

Of course, the point you raise also illustrates why you’d be nuts to use logarithms to any base other than e if you’re going to be taking derivatives or integrating. By the change-of-base rule, log₁₀(x) = ln(x)/ln(10), so every derivative picks up a bogus constant factor of 1/ln(10). The shapes of the curves are the same in any case, so using base 10 gets you only pain and no actual advantage.
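To see that constant factor concretely, here’s a small numerical check: the derivative of log₁₀ at a point, estimated by a central difference, matches 1/(x·ln 10) rather than the clean 1/x you’d get with the natural log.

```python
import numpy as np

x = 3.0
h = 1e-6

# central-difference estimate of d/dx log10(x)
numeric = (np.log10(x + h) - np.log10(x - h)) / (2 * h)

# analytic derivative: 1 / (x * ln 10) -- note the extra 1/ln(10) factor
analytic = 1.0 / (x * np.log(10.0))

print(numeric, analytic)  # the two agree; neither is the tidy 1/x
```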
