Hello, can someone help me figure out what is wrong in the code below? I am trying to solve Exercise 4 (Prediction) but I am getting errors:
```python
# UNQ_C4
# GRADED FUNCTION: predict

def predict(X, w, b):
    """
    Predict whether the label is 0 or 1 using learned logistic
    regression parameters w

    Args:
      X : (ndarray Shape (m,n)) data, m examples by n features
      w : (ndarray Shape (n,))  values of parameters of the model
      b : (scalar)              value of bias parameter of the model

    Returns:
      p : (ndarray (m,)) The predictions for X using a threshold at 0.5
    """
    # number of training examples
    m, n = X.shape
    p = np.zeros(m)

    # Loop over each example
    for i in range(m):
        z_i = np.dot(X[i], w) + b
        f_wb_i = sigmoid(z_i)
        cost += -y[i]*np.log(f_wb_i) - (1-y[i])*np.log(1-f_wb_i)
    cost = cost / m
    return cost
    p[i] = f_wb >= 0.5
    return p
```
Hello @Sultan_Mubashir, you don't need to calculate the cost in this exercise; you just have to return `p`. So remove the lines in which you are calculating the cost (the `cost += …`, `cost = cost / m`, and `return cost` lines).
Side note: the error comes from the line `cost += …`, because it is interpreted as `cost = cost + …`; since `cost` was never initialized before the loop, Python raises a `NameError`.
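To see that mechanism in isolation, here is a small standalone sketch (the array `values` and the name `total` are made up just for illustration) showing why an uninitialized accumulator fails, and the usual fix of initializing it before the loop:

```python
import numpy as np

values = np.array([0.25, 0.5, 0.75])  # hypothetical numbers, for illustration only

# Fails: `total += v` expands to `total = total + v`, and `total`
# was never defined, so Python raises a NameError on the first iteration.
try:
    for v in values:
        total += v
except NameError as e:
    print(e)  # name 'total' is not defined

# Works: initialize the accumulator before the loop.
total = 0.0
for v in values:
    total += v
print(total)  # 1.5
```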
You have to run all the above cells every time you open your notebook.
One more thing: users are not allowed to share their code in a public thread. Make sure to delete it to honor the code of conduct of our community.
It means that your `predict` function doesn't know what `f_wb` is. I suggest checking the hints (and the "more hints") given just below that cell.
This is the hint I got for f_wb. Should I copy and paste it into the code?
You can calculate f_wb as:

```python
for i in range(m):
    # Calculate f_wb (exactly how you did it in the compute_cost function above)
    z_wb = 0
    # Loop over each feature
    for j in range(n):
        # Add the corresponding term to z_wb
        z_wb_ij = X[i, j] * w[j]
        z_wb += z_wb_ij

    # Add bias term
    z_wb += b

    # Calculate the prediction from the model
    f_wb = sigmoid(z_wb)
```
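For context, here is a minimal self-contained sketch of how that hint fits into the whole `predict` function. This is an illustrative reconstruction, not the official solution: `sigmoid` is written out inline (the notebook defines it in an earlier exercise), and the values of `X`, `w`, and `b` below are made up.

```python
import numpy as np

def sigmoid(z):
    # Standard logistic function, as defined earlier in the notebook.
    return 1 / (1 + np.exp(-z))

def predict(X, w, b):
    m, n = X.shape
    p = np.zeros(m)
    for i in range(m):
        # np.dot over the features is equivalent to the explicit j-loop in the hint.
        z_wb = np.dot(X[i], w) + b
        f_wb = sigmoid(z_wb)
        # Threshold the probability at 0.5 to get a 0/1 prediction.
        p[i] = f_wb >= 0.5
    return p

# Hypothetical example data: 3 examples, 2 features.
X = np.array([[1.0, 1.0], [3.0, 3.0], [-1.0, -1.0]])
w = np.array([1.0, 1.0])
b = -4.0
print(predict(X, w, b))  # [0. 1. 0.]
```

Note that the function only accumulates predictions into `p`; there is no cost calculation anywhere, which is why the `cost` lines in the original post had to go.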
@Sultan_Mubashir I would suggest brushing up on your Python (or general programming) fundamentals; you will then spot these kinds of mistakes easily.