Hello,

The course (‘**Optional Logistic Regression: Gradient**’) only details the computation of the derivative of h(x), but not of h(x^{(i)}, \theta) with respect to \theta. Could someone describe the steps to do so?

Thanks,

Hey,

It’s an application of the chain rule, with the inner function being g(\theta) = \theta_1 x_1 + \theta_2 x_2 + \theta_3 x_3, which is linear in each component of \theta.
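
In case it helps, here is a sketch of that chain-rule step, assuming the hypothesis is h(x^{(i)}, \theta) = \sigma(g(\theta)) with the sigmoid \sigma(z) = 1/(1 + e^{-z}) as its outer function:

```latex
% Chain rule applied to h(x^{(i)}, \theta) = \sigma(g(\theta)),
% where g(\theta) = \theta^\top x^{(i)} is linear in \theta.
\frac{\partial h}{\partial \theta_j}
  = \sigma'\bigl(g(\theta)\bigr) \cdot \frac{\partial g}{\partial \theta_j}
  = \sigma\bigl(\theta^\top x^{(i)}\bigr)
    \Bigl(1 - \sigma\bigl(\theta^\top x^{(i)}\bigr)\Bigr)\, x_j^{(i)}
```

The second factor uses \partial g / \partial \theta_j = x_j^{(i)}, which follows directly from the linearity of g in each \theta_j; the first factor is the standard sigmoid derivative \sigma'(z) = \sigma(z)(1 - \sigma(z)).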
