UNQ_C3
# GRADED FUNCTION: compute_gradient
def compute_gradient(X, y, w, b, *argv):
    """
    Computes the gradient for logistic regression.
    Args:
      X : (ndarray Shape (m,n))  data, m examples by n features
      y : (ndarray Shape (m,))   target values
      w : (ndarray Shape (n,))   values of the parameters of the model
      b : (scalar)               value of the bias parameter of the model
      *argv : unused, for compatibility with the regularized version below
    Returns:
      dj_db : (scalar)           the gradient of the cost w.r.t. the parameter b
      dj_dw : (ndarray Shape (n,)) the gradient of the cost w.r.t. the parameters w
    """
    {code removed by mentor}
    ### END CODE HERE ###
    return dj_db, dj_dw

Apparently you have not yet implemented your solution exactly as required. I suggest you take a break, copy your code somewhere else as a backup, and then restart from scratch or with the hint provided underneath the exercise cell.

Please note that we don’t solve the problem for you. Also, I have removed your code, as sharing assignment work is not allowed by the code of conduct.

Hi @rmwkwok
I did follow the hints. If you could point me to which section needs revisiting, I’d be very thankful. I understand you can’t solve the problem for me, but guiding me on where I’ve erred would help me out.
Thanks,
Vinay

It is your chance to discover and solve the problem, but I can suggest a way for you to help yourself.

Between every line of your code, add a print statement that prints the result of the previous line. Then run your function with a simple dataset of 2 samples and 2 features, and a simple set of w and b. By simple, I mean distinct integer values, avoiding zeros and ones.

Then, on a piece of paper, do the math to get the expected results and compare them with the printed results. As soon as you see a mismatch, you have located the faulty section. This exercise shouldn’t take you more than 30 minutes.
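To illustrate the tracing technique only (not the graded gradient code, which stays removed), here is a minimal NumPy sketch of the prediction step with a 2-sample, 2-feature dataset; the sigmoid helper and all the example values are my own assumptions, chosen to be easy to verify by hand:

```python
import numpy as np

def sigmoid(z):
    # Standard logistic function, defined here so the sketch is self-contained.
    return 1.0 / (1.0 + np.exp(-z))

# A simple dataset of 2 samples and 2 features, using distinct
# integers and avoiding zeros and ones, as suggested above.
X = np.array([[2.0, 3.0],
              [4.0, 5.0]])
w = np.array([2.0, 3.0])
b = 4.0

# Print the result of every step so each value can be checked
# against the same computation done by hand on paper.
z = X @ w + b
print("z =", z)   # by hand: [2*2 + 3*3 + 4, 4*2 + 5*3 + 4] = [17, 27]

f = sigmoid(z)
print("f =", f)   # by hand: both entries very close to 1, since z is large
```

The moment a printed value disagrees with the hand-computed one, the line just above that print is where the bug lives.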

This is a general technique that you can use for many future assignments. @Vinay_SR, we all can make mistakes, but we need to be able to find and fix them.