C1_W1_lab05: Linear regression code questions


So, when I finished the theory videos of week one, I went ahead and started the labs, but I also found other code for the same course that I tried to work through.

One of the assignments I tried was the following:
ml-coursera-python-assignments/Exercise1 at master · dibgerge/ml-coursera-python-assignments (github.com). After finishing that assignment, the next one I did was lab05 of C1_W1, from the course's optional labs.

My issue is that, after working for some time on the first exercise I linked and then moving to lab05, I noticed a lot of differences. For example:

In the first exercise this is how I was asked to calculate the cost:

import numpy as np

def computeCost(X, y, theta):
    # Initialize some useful values
    m = y.size  # number of training examples
    # Vectorized cost: J = 1/(2m) * sum((X @ theta - y)^2)
    J = (1 / (2 * m)) * np.sum(np.square(X.dot(theta) - y))
    return J

But in lab05 I was asked to do it differently:

def compute_cost(x, y, w, b):
    m = x.shape[0]  # number of training examples
    cost = 0
    for i in range(m):
        f_wb = w * x[i] + b              # model prediction for example i
        cost = cost + (f_wb - y[i]) ** 2  # accumulate squared error
    total_cost = 1 / (2 * m) * cost
    return total_cost

These differences are not limited to these specific functions: the way gradient descent is computed in lab05 is totally different from how the GitHub assignment asks for it. So my questions are:

Which is the proper way to implement gradient descent, code-wise?
Should I always use and implement w and b?

I'm a little bit confused now after doing these two assignments.

Hello @Mamey,

Both are proper. The first takes the vectorization approach, while the second takes the loop approach. Both will work, but the first will run faster. Please watch the vectorization videos in Course 1 Week 2 for Andrew's explanation of vectorization.
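
To see that the two styles really are the same computation, here is a minimal sketch (with made-up toy data) that maps the lab05 parameters (w, b) onto the assignment's theta vector. The trick is that theta = [b, w] once X gets a leading column of ones, so X @ theta equals b + w * x:

```python
import numpy as np

# Toy data (hypothetical, just for the comparison)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])
w, b = 1.5, 0.5

# Loop version (lab05 style): scalar w and b
def compute_cost(x, y, w, b):
    m = x.shape[0]
    cost = 0.0
    for i in range(m):
        f_wb = w * x[i] + b
        cost += (f_wb - y[i]) ** 2
    return cost / (2 * m)

# Vectorized version (GitHub-assignment style): single theta vector
def computeCost(X, y, theta):
    m = y.size
    return (1 / (2 * m)) * np.sum(np.square(X.dot(theta) - y))

# Map (w, b) onto theta: prepend a column of ones to X, put b first
X = np.column_stack([np.ones_like(x), x])
theta = np.array([b, w])

print(compute_cost(x, y, w, b))      # 0.4375
print(computeCost(X, y, theta))      # 0.4375 -- identical result
```

So neither formulation is "more correct": (w, b) keeps the bias explicit, while theta folds it into one vector so the whole model becomes a single matrix product.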


Do we need this code?
Because after commenting it out (i.e. with #), we get the same results as before.

Hello @Hesam_Gholami,

Welcome to our community!

Please check out this thread about deepcopy.


PS: Please open a new thread for any new question. I am closing this thread.