I wrote my own simple Python code for (univariate) Simple Linear Regression. It directly applies the gradient descent updates inside a for loop, so no libraries are used apart from Pandas. However, I'm getting an error when I run it. Is there somewhere I can post my code so someone can point out where I'm going wrong?

(Sorry if this is the wrong category, I’m new to the community.)

Thank you for responding! As you can see in the attached screenshots, the error appears for a given value of the learning rate (denoted by the variable LR) and number of iterations (denoted by "it").

Moreover, on further experimentation with values of LR and the number of iterations (including learning rates down to the fifth decimal place), some results do come out. However, they are not the actual regression line equation, which I verified in Excel. Please advise!
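Since the gradient-descent results were checked against Excel, a quick way to do the same cross-check in Python is the closed-form least-squares solution (equivalent to Excel's SLOPE/INTERCEPT). This is a minimal sketch, not the OP's code; the data values are made up for illustration:

```python
# Closed-form least squares for y = w*x + b, to cross-check
# gradient-descent results (same answer Excel's SLOPE/INTERCEPT gives).
def ols_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b

# Example data (hypothetical)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.2, 8.8]
w, b = ols_fit(xs, ys)
```

If gradient descent is converging, its (w, b) should approach these values as the number of iterations grows.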

There are some wrong implementations in this code: w and b are not single zero numbers here, they are vectors of zeros. In addition, the implementations of drrifw and drrifb are not correct: when you calculate the sum, each error term should be multiplied by the corresponding x_i, as in this photo.

I'll think through what you said and make the necessary corrections, or come back after watching the videos again in case I have more questions. Thank you for your time, sir!