I am not able to understand the optional lab that implements Linear Regression using Scikit-learn. I want to learn the Scikit-learn library. It would help if good resources on scikit-learn and matplotlib could be shared.
Hi there,
In general, the scikit-learn documentation is quite good. Feel free to take a look at these examples and the underlying documentation:
- Linear Regression Example - scikit-learn 1.1.2 documentation
- Principal Component Regression vs Partial Least Squares Regression - scikit-learn 1.1.2 documentation
Can you outline what specifically is not clear?
Best regards
Christian
Scikit-learn has a web page with tutorials.
https://scikit-learn.org/stable/
Also, what do those values inside the fit function do? The fit method takes the training data set as its argument, right? But I don't quite understand what is shown in the picture.
Hi!
The values passed to the fit function are the data points and their labels.
The data is an array of arrays, each containing 2 values, and the labels are a single array whose entries are the labels for the corresponding rows of the first array.
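As a minimal sketch (not the lab's exact code; the numbers are made up purely to illustrate the shapes of the two arguments), the call could look like this:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# X: array of arrays, one row per training example, 2 feature values per row -> shape (4, 2)
X = np.array([[1.0, 2.0],
              [2.0, 0.0],
              [3.0, 1.0],
              [4.0, 3.0]])

# y: single array, one label per corresponding row of X -> shape (4,)
y = np.array([6.0, 4.0, 7.0, 12.0])

model = LinearRegression()
model.fit(X, y)  # fit(training data, labels)
```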
Regarding the picture:
You can see that the linear function was fitted to the training data, meaning the cost function (the sum of squared errors) is minimised. In other words, the function's coefficients get calibrated:
- gradient of the linear function
- y intercept
so that the function fits all training observations with feature x and label y in an optimal way (see the small sketch below).
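To make that concrete, here is a sketch with a single 1D feature (made-up numbers, not the lab's data). After fitting, the gradient and the y intercept of the fitted line are exposed as coef_ and intercept_:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# One feature x per example; y is roughly 2*x + 1 plus a little noise
x = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

model = LinearRegression()
model.fit(x, y)  # minimises the sum of squared errors over the training data

print("gradient (slope):", model.coef_[0])
print("y intercept:", model.intercept_)
```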
In general, x does not need to be 1D; the linear model can of course depend on higher-dimensional features, e.g. describing a plane y = f(x_1, x_2), etc.
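A quick sketch of that multi-feature case, again with made-up numbers: the fitted model is then a plane y = w_1*x_1 + w_2*x_2 + b, with one coefficient per feature.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Two features per example, so the fitted model describes a plane
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 3.0],
              [4.0, 1.0],
              [5.0, 4.0]])
y = np.array([8.0, 7.0, 15.0, 13.0, 23.0])

model = LinearRegression().fit(X, y)

w1, w2 = model.coef_   # one coefficient (gradient) per feature
b = model.intercept_   # y intercept of the plane

# A prediction is just w1*x1 + w2*x2 + b
x_new = np.array([[2.0, 3.0]])
print(model.predict(x_new), w1 * 2.0 + w2 * 3.0 + b)
```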
Check out the description here:
Best
Christian