Week 2: Practice lab

Hello, I just started the Machine Learning Specialization and am currently working through the practice lab assignment. As most of you are aware, the assignment has two parts: one to write the cost function and one to define the gradient descent function. My code for the cost function seems to pass the test code. It is as follows:

# UNQ_C1
# GRADED FUNCTION: compute_cost

def compute_cost(x, y, w, b):
    """
    Computes the cost function for linear regression.

    Args:
        x (ndarray): Shape (m,) Input to the model (Population of cities)
        y (ndarray): Shape (m,) Label (Actual profits for the cities)
        w, b (scalar): Parameters of the model

    Returns
        total_cost (float): The cost of using w,b as the parameters for linear regression
            to fit the data points in x and y
    """
    # number of training examples
    m = x.shape[0]

    # You need to return this variable correctly
    total_cost = 0
    cost = 0

    ### START CODE HERE ###

    for i in range(m):
        f_wb = w * x[i] + b
        cost = cost + (f_wb - y[i]) ** 2
        total_cost = 1 / (2 * m) * cost

    ### END CODE HERE ###

    return total_cost

However, when it comes to the gradient descent function, I run into problems when testing it: I get an assertion error, and I cannot figure out what could possibly be wrong. My code is shown below:

# UNQ_C2
# GRADED FUNCTION: compute_gradient

def compute_gradient(x, y, w, b):
    """
    Computes the gradient for linear regression
    Args:
        x (ndarray): Shape (m,) Input to the model (Population of cities)
        y (ndarray): Shape (m,) Label (Actual profits for the cities)
        w, b (scalar): Parameters of the model
    Returns
        dj_dw (scalar): The gradient of the cost w.r.t. the parameters w
        dj_db (scalar): The gradient of the cost w.r.t. the parameter b
    """

    # Number of training examples
    m = x.shape[0]

    # You need to return the following variables correctly
    dj_dw = 0
    dj_db = 0

    ### START CODE HERE ###
    for i in range(m):
        f_wb = w * x[i] + b
        dj_dw_i = (f_wb - y[i]) * x[i]
        dj_db_i = f_wb - y[i]
        dj_dw += dj_dw_i
        dj_db += dj_db_i
        dj_dw = dj_dw / m
        dj_db = dj_db / m

    ### END CODE HERE ###

    return dj_dw, dj_db

What is wrong with the code shown above? I have completed the optional labs and everything there checks out, so why do I still receive an assertion error?

Hi!

Please check these lines:

dj_dw = dj_dw/m
dj_db = dj_db/m

How many times do you want to do this operation?
How many times is this operation happening in your code?
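
As a general pattern (a minimal sketch with placeholder numbers, nothing to do with the assignment's variables), averaging a running sum looks like this:

values = [2.0, 4.0, 6.0]           # placeholder data for illustration

total = 0.0
for v in values:
    total += v                     # the accumulation runs once per example

average = total / len(values)      # the division runs exactly once, after the loop ends
print(average)                     # 4.0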

Hope this helps!

Sam

Please check my code; I have been stuck for weeks.

Code for assignment 1:

Exercise 1
Complete the compute_cost below to:

Iterate over the training examples, and for each example, compute:

The prediction of the model for that example:
$f_{w,b}(x^{(i)}) = w x^{(i)} + b$

The cost for that example:
$\mathrm{cost}^{(i)} = (f_{w,b} - y^{(i)})^2$

Return the total cost over all examples:
$J(\mathbf{w},b) = \frac{1}{2m} \sum_{i=0}^{m-1} \mathrm{cost}^{(i)}$

Here, $m$ is the number of training examples and $\sum$ is the summation operator.
If you get stuck, you can check out the hints presented after the cell below to help you with the implementation.

# UNQ_C1
# GRADED FUNCTION: compute_cost

def compute_cost(x, y, w, b):
    """
    Computes the cost function for linear regression.

    Args:
        x (ndarray): Shape (m,) Input to the model (Population of cities)
        y (ndarray): Shape (m,) Label (Actual profits for the cities)
        w, b (scalar): Parameters of the model

    Returns
        total_cost (float): The cost of using w,b as the parameters for linear regression
            to fit the data points in x and y
    """
    # number of training examples
    m = x.shape[0]

    # You need to return this variable correctly
    total_cost = 0

    ### START CODE HERE ###

    cost_sum = 0

    for i in range(m):
        f_wb = w * x[i] + b
        cost = (f_wb - y[i]) ** 2
        cost_sum = cost_sum + cost

    total_cost = (1 / (2 * m)) * cost_sum
    ### END CODE HERE ###

    return total_cost

Code for problem no. 2:

def compute_gradient(x, y, w, b):
    """
    Computes the gradient for linear regression
    Args:
        x (ndarray): Shape (m,) Input to the model (Population of cities)
        y (ndarray): Shape (m,) Label (Actual profits for the cities)
        w, b (scalar): Parameters of the model
    Returns
        dj_dw (scalar): The gradient of the cost w.r.t. the parameters w
        dj_db (scalar): The gradient of the cost w.r.t. the parameter b
    """

    # Number of training examples
    m = x.shape[0]

    # You need to return the following variables correctly
    dj_dw = 0
    dj_db = 0

    ### START CODE HERE ###

    for i in range(m):
        f_wb = w * x[i] + b
        dj_dw_i = (f_wb - y[i]) * x[i]
        dj_db_i = f_wb - y[i]
        dj_db += dj_db_i
        dj_dw += dj_dw_i

    dj_dw = dj_dw / m
    dj_db = dj_db / m

    ### END CODE HERE ###

    return dj_dw, dj_db
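
One way to sanity-check both functions outside the grader is to run them on a tiny made-up dataset whose cost and gradients are easy to verify by hand (the numbers below are purely illustrative and are not the lab's data):

import numpy as np

# Tiny made-up dataset, chosen so the expected values are easy to check by hand.
x = np.array([1.0, 2.0])
y = np.array([3.0, 5.0])

# With w=2, b=1 the model fits the data exactly, so the cost and both gradients are 0.
print(compute_cost(x, y, 2.0, 1.0))       # expected: 0.0
print(compute_gradient(x, y, 2.0, 1.0))   # expected: (0.0, 0.0)

# With w=0, b=0 the errors are f_wb - y = [-3, -5]:
#   cost  = (9 + 25) / (2 * 2)      = 8.5
#   dj_dw = ((-3)*1 + (-5)*2) / 2   = -6.5
#   dj_db = (-3 + (-5)) / 2         = -4.0
print(compute_cost(x, y, 0.0, 0.0))       # expected: 8.5
print(compute_gradient(x, y, 0.0, 0.0))   # expected: (-6.5, -4.0)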