W2_A2_Derivatives of Logistic Regression

I don’t understand why dw = xdz

Thanks. :pleading_face::melting_face::robot:

Hi @Enfant,

You need to learn differentiation. You can find some online tutorials to get started.

I will explain the steps as if you already know differentiation. If you don't, read my explanation, hold on to it, find a tutorial, learn differentiation, and then come back to it.

To begin with, it is this course's convention to write \frac{\partial{L}}{\partial{w_1}} as dw_1. In general, \frac{\partial{L}}{\partial{\text{something}}} is written as d\text{something}.

Next, by chain rule, \frac{\partial{L}}{\partial{w_1}} = \frac{\partial{L}}{\partial{z}}\frac{\partial{z}}{\partial{w_1}}, because L depends on w_1 through z.

Using our convention again, \frac{\partial{L}}{\partial{z}} = dz

Lastly, since z = w_1x_1 + w_2x_2 + b and only the first term depends on w_1, differentiating gives \frac{\partial{z}}{\partial{w_1}} = x_1.

Consequently, dw_1 = \frac{\partial{L}}{\partial{w_1}} = \frac{\partial{L}}{\partial{z}}\frac{\partial{z}}{\partial{w_1}} = x_1dz.
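To make this concrete, here is a minimal NumPy sketch that computes dw_1 = x_1 dz for one training example and checks it against a numerical derivative of the logistic loss. The feature values, weights, and the `sigmoid` helper are illustrative, not taken from the assignment.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One training example with two features (illustrative values)
x = np.array([0.5, -1.2])
w = np.array([0.3, 0.8])
b = 0.1
y = 1.0

# Forward pass
z = np.dot(w, x) + b
a = sigmoid(z)

# Backward pass, using the course's convention:
# dz = dL/dz = a - y  (for the logistic loss)
dz = a - y
# dw = dL/dw = x * dz, by the chain rule explained above
dw = x * dz

# Numerical check: perturb w_1 slightly and see how the loss changes
def loss(w_):
    a_ = sigmoid(np.dot(w_, x) + b)
    return -(y * np.log(a_) + (1 - y) * np.log(1 - a_))

eps = 1e-6
w_plus = w.copy()
w_plus[0] += eps
numeric_dw1 = (loss(w_plus) - loss(w)) / eps

print(dw[0], numeric_dw1)  # the two values agree closely
```

If the chain-rule result is right, `dw[0]` and `numeric_dw1` should match to several decimal places, which is an easy sanity check to run on any gradient you derive by hand.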

The convention is this course's own, so you won't necessarily see it anywhere else; just accept it. Differentiation is something you need to learn, and make sure to understand at least the idea of the Chain Rule to follow my explanation completely. You will also come across other rules (such as the Product Rule) that will be helpful in your future data science journey.

Cheers,
Raymond


Thank you a lot for your explanation :star_struck: