How do I do this? Can anyone help me fix this error?

Hi @Dhivya_R

Welcome to the community!

The implementation of the gradient isn't correct. Please follow these steps:

First, compute this term,

and then update the parameters using these equations:

\begin{align*}
& \text{repeat until convergence:} \; \lbrace \\
& \quad b := b - \alpha \frac{\partial J(w,b)}{\partial b} \\
& \quad w := w - \alpha \frac{\partial J(w,b)}{\partial w} \tag{1} \\
& \rbrace
\end{align*}
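As a minimal sketch of the update loop in equation (1) (this is not the assignment code; `compute_gradient` here is a placeholder for your own gradient function, passed in as a parameter):

```python
# Sketch of the "repeat until convergence" loop in equation (1).
# compute_gradient is any function returning (dj_db, dj_dw) for the current w, b.

def gradient_descent(x, y, w, b, alpha, num_iters, compute_gradient):
    """Run num_iters simultaneous updates of w and b with learning rate alpha."""
    for _ in range(num_iters):
        # Compute both gradients first, then update both parameters,
        # so w and b are updated simultaneously.
        dj_db, dj_dw = compute_gradient(x, y, w, b)
        b = b - alpha * dj_db
        w = w - alpha * dj_dw
    return w, b
```

In practice you would also monitor the cost J(w, b) to decide when it has converged, rather than fixing the number of iterations.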

Finally, you can check the hint below the code.

Best Regards,

Abdelrahman

I am also stuck here. My value differs by just 0.1 for dj_db. Please, someone assist us.

Thanks for your reply sir.

As per the hint, I have implemented the gradient, but I'm still having an issue. To compute equations (2) and (3), we first need to know the initial parameter values (w and b). Can I share my implementation of the gradient code here, sir?

Compute the sigmoid f(x),

then the loss terms for db and dw,

then divide db and dw by m,

then return db, dw.

@Dhivya_R

The initial values of w and b are passed to you as parameters in the function def compute_gradient(x, y, w, b). Use these values to compute f(x) as f(x) = w*x^{(i)} + b.

Sorry, the code can't be shared here, as it's not allowed.

You don't compute the loss function here; you just compute the error f(x) - y and multiply it by the partial derivative of f(x), according to these equations.

After that, you divide by m and return the two variables db, dw.
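To illustrate the explanation above with a hedged sketch (assuming the linear model f(x) = w*x^{(i)} + b mentioned earlier in this thread, not the official assignment code):

```python
# Illustrative gradient computation: accumulate error * (partial derivative of f),
# then divide by m. Assumes the linear model f(x) = w*x + b from this thread.

def compute_gradient(x, y, w, b):
    """Return (dj_db, dj_dw): gradients of the cost J(w, b) over m examples."""
    m = len(x)
    dj_db = 0.0
    dj_dw = 0.0
    for i in range(m):
        f_x = w * x[i] + b      # model prediction for example i
        error = f_x - y[i]      # the error term, not the loss itself
        dj_db += error          # partial derivative of f w.r.t. b is 1
        dj_dw += error * x[i]   # partial derivative of f w.r.t. w is x^(i)
    # Divide the accumulated sums by m and return both gradients.
    return dj_db / m, dj_dw / m
```

Note the derivative of f with respect to b is 1, so db accumulates the bare error, while dw accumulates error * x^(i).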