Hi,
Although I have finished the course, I'm in the process of going back through a couple of things and manually calculating them. I'm currently on Multiple Variable Linear Regression, specifically the following cost function:
import numpy as np

def compute_cost(X, y, w, b):
    """
    compute cost
    Args:
      X (ndarray (m,n)): Data, m examples with n features
      y (ndarray (m,)) : target values
      w (ndarray (n,)) : model parameters
      b (scalar)       : model parameter
    Returns:
      cost (scalar): cost
    """
    m = X.shape[0]
    cost = 0.0
    for i in range(m):
        f_wb_i = np.dot(X[i], w) + b         # (n,)(n,) = scalar (see np.dot)
        cost = cost + (f_wb_i - y[i])**2     # scalar
    cost = cost / (2 * m)                    # scalar
    return cost
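As a side note, I believe a vectorized version of the same cost (my own sketch, not from the lab) would look something like this and should give the same numbers:

import numpy as np

def compute_cost_vectorized(X, y, w, b):
    """Same cost as above, computed without the explicit loop."""
    m = X.shape[0]
    f_wb = X @ w + b                         # (m,n) @ (n,) -> (m,) predictions
    return np.sum((f_wb - y) ** 2) / (2 * m)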
If I set m = 1 and run the modified code:
m = 1
i = 0
cost = 0.0
f_wb_i = np.dot(X[i], w) + b    # (n,)(n,) = scalar (see np.dot)
print(f_wb_i)                   # "return f_wb_i" in the original function; printed here so it runs on its own
Then I get 459.99999761940825, but in Excel I get 457.2311368.
X = [2104, 5, 1, 45] (note: I'm only showing the first feature vector rather than all 3)
y = [460, 232, 178]
w = [0.39, 18.75, -53.36, -26.42]
b = 785.1811367994083
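For reference, here is how I'd lay those values out in numpy to reproduce the comparison (my own sketch; x0 is just my name for that first feature vector, since I'm only looking at the i = 0 case):

import numpy as np

x0 = np.array([2104, 5, 1, 45])               # first feature vector, as listed above
w  = np.array([0.39, 18.75, -53.36, -26.42])  # w as listed above
b  = 785.1811367994083

print(np.dot(x0, w) + b)   # with these numbers this should come out near the Excel value, 457.23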
Calculating W·X:
My understanding of the dot product is X0*W0 + X1*W1 + ... + Xn*Wn,
so I get:
820.56 + 93.75 - 53.36 - 1188.9 = -327.95
Then I add b to get the model output for i = 0:
-327.95 + 785.18
= 457.23
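Spelled out in plain Python, just to mirror the hand calculation above, that sum looks like:

terms = [2104 * 0.39, 5 * 18.75, 1 * -53.36, 45 * -26.42]
print(terms)                             # roughly [820.56, 93.75, -53.36, -1188.9]
print(sum(terms))                        # roughly -327.95
print(sum(terms) + 785.1811367994083)    # roughly 457.23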
I'm confused as to why np.dot() is giving a different value and where I have screwed up.