When I reach Section 2.6 on gradient descent, I keep getting error messages. I can't edit the cell that seems to be causing the error; it complains about the lambda_ variable not being defined. I tried to submit as is, but the autograder wouldn't accept it. Below is the error message for gradient descent.
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
in
      8
      9 w, b, J_history, _ = gradient_descent(X_train, y_train, initial_w, initial_b,
---> 10                                       compute_cost, compute_gradient, alpha, iterations, 0)

in gradient_descent(X, y, w_in, b_in, cost_function, gradient_function, alpha, num_iters, lambda_)
     40         # Save cost J at each iteration
     41         if i < 100000:      # prevent resource exhaustion
---> 42             cost = cost_function(X, y, w_in, b_in, lambda_)
     43             J_history.append(cost)
     44

TypeError: compute_cost() takes 4 positional arguments but 5 were given
Expected Output: Cost 0.30
This is because the cell throwing the error runs the code you added to one of the graded functions. In this case, the traceback points at compute_cost(): it was called with five arguments but your definition only accepts four, which means your compute_cost() is missing the lambda_ parameter. This does not indicate an error in the calling cell. It indicates an error in your code.
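For reference, here is what a compute_cost() that accepts the fifth argument looks like. This is a minimal sketch, not the lab's reference solution: the sigmoid helper, the exact cost formula, and the default value of lambda_ are assumptions based on the standard regularized logistic cost.

```python
import numpy as np

def sigmoid(z):
    # Logistic function (assumed helper, defined in the lab elsewhere)
    return 1 / (1 + np.exp(-z))

def compute_cost(X, y, w, b, lambda_=1):
    """Regularized logistic cost. The key point is the fifth
    parameter, lambda_: without it, the call
    cost_function(X, y, w_in, b_in, lambda_) inside
    gradient_descent raises exactly this TypeError."""
    m = X.shape[0]
    f = sigmoid(X @ w + b)                         # (m,) predictions
    cost = -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))
    cost += (lambda_ / (2 * m)) * np.sum(w ** 2)   # regularization term
    return cost

# Called with five positional arguments, as gradient_descent does:
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([0.0, 1.0])
w = np.array([0.1, -0.2])
print(round(compute_cost(X, y, w, 0.5, 0.0), 4))   # → 0.7456
```

With the old four-argument signature, the same call fails with "takes 4 positional arguments but 5 were given" — the error message names the function whose signature needs fixing.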
Also, be sure that every time you open a notebook (or modify a cell), you run all of the cells starting from the top of the page. This will keep the state of the notebook current with any changes you’ve made.
I re-ran all the cells, but I am still seeing the error message. I cannot modify the compute_gradient() cell that seems to be prompting the error. It is the cell right before Section 2.6 in the Week 3 Logistic Regression lab.
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
in
      8
      9 # UNIT TESTS
---> 10 compute_gradient_test(compute_gradient)

~/work/public_tests.py in compute_gradient_test(target)
     51     dj_db, dj_dw = target(X, y, test_w, test_b)
     52
---> 53     assert np.isclose(dj_db, 0.28936094), f"Wrong value for dj_db. Expected: {0.28936094} got: {dj_db}"
     54     assert dj_dw.shape == test_w.shape, f"Wrong shape for dj_dw. Expected: {test_w.shape} got: {dj_dw.shape}"
     55     assert np.allclose(dj_dw, [-0.11999166, 0.41498775, -0.71968405]), f"Wrong values for dj_dw. Got: {dj_dw}"

ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
You cannot modify the cell prior to Section 2.6 because it contains the unit test for the compute_gradient() function. There is no problem with the test cell; the problem is in the code you added to compute_gradient().
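To see why Python raises that particular ValueError: np.isclose returns a single boolean when given a scalar, but a boolean array when given an array, and assert cannot evaluate the truth of a multi-element array. So the test cell is telling you that your dj_db came back as an array when the test expects a scalar. A standalone demonstration (the array values below are illustrative, not from the lab):

```python
import numpy as np

# np.isclose on a scalar returns a single bool, which assert handles fine:
assert np.isclose(0.28936094, 0.28936094)

# But if dj_db is accidentally an array (e.g. you divided the per-example
# errors by m without summing them), np.isclose returns an ARRAY of bools,
# and assert cannot decide whether that array is True or False:
dj_db_wrong = np.array([0.1, 0.2, 0.3])
try:
    assert np.isclose(dj_db_wrong, 0.28936094)
except ValueError as e:
    print(type(e).__name__)   # → ValueError
```

The fix therefore lives inside compute_gradient(), in whatever line builds dj_db, not in the test cell.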
Thanks, but I am still lost. I see the cell that I can edit, but I'm not sure how to correct the code. The two cells right before 2.6 don't look right to me: the dj_db and dj_dw expected outputs appear to be flipped. I understand the issue might be around returning a scalar where a vector is expected (or vice versa). Can you help explain why the code is not returning the correct values?
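Here is a sketch of how the two return values should be shaped. This is a generic implementation of the logistic-regression gradient, not the lab's graded solution; the sigmoid helper and variable names are assumptions. Two points to check against your code: dj_db must be a scalar (sum the errors over all m examples before dividing by m), and the return order (dj_db, dj_dw) must match the test's unpacking `dj_db, dj_dw = target(...)` — so the expected outputs are not flipped; the test asks for the scalar first.

```python
import numpy as np

def sigmoid(z):
    # Logistic function (assumed helper from earlier in the lab)
    return 1 / (1 + np.exp(-z))

def compute_gradient(X, y, w, b):
    """Gradient of the (unregularized) logistic cost."""
    m = X.shape[0]
    err = sigmoid(X @ w + b) - y   # (m,) vector of prediction errors
    dj_db = np.sum(err) / m        # SCALAR: sum over examples, then divide
    dj_dw = (X.T @ err) / m        # (n,) vector, one entry per feature
    return dj_db, dj_dw            # scalar first, matching the unit test

X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = np.array([0.0, 1.0, 1.0])
dj_db, dj_dw = compute_gradient(X, y, np.zeros(2), 0.0)
print(np.ndim(dj_db), dj_dw.shape)   # → 0 (2,)
```

If you write `dj_db = err / m` (dividing without np.sum), dj_db becomes an (m,) array, and the unit test fails with exactly the ambiguous-truth-value ValueError you are seeing.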