# Compute and display cost and gradient with non-zero w and b
test_w = np.array([ 0.2, -0.5])
test_b = -24
dj_db, dj_dw = compute_gradient(X_train, y_train, test_w, test_b)
print('dj_db at test w and b:', dj_db)
print('dj_dw at test w and b:', dj_dw.tolist())
# UNIT TESTS
compute_gradient_test(compute_gradient)
dj_db at test w and b: -0.6
dj_dw at test w and b: [-44.83135361795273, -44.37384124957207]
AssertionError Traceback (most recent call last)
in
8
9 # UNIT TESTS
---> 10 compute_gradient_test(compute_gradient)
~/work/public_tests.py in compute_gradient_test(target)
51 dj_db, dj_dw = target(X, y, test_w, test_b)
52
---> 53 assert np.isclose(dj_db, 0.28936094), f"Wrong value for dj_db. Expected: {0.28936094} got: {dj_db}"
54 assert dj_dw.shape == test_w.shape, f"Wrong shape for dj_dw. Expected: {test_w.shape} got: {dj_dw.shape}"
55 assert np.allclose(dj_dw, [-0.11999166, 0.41498775, -0.71968405]), f"Wrong values for dj_dw. Got: {dj_dw}"
AssertionError: Wrong value for dj_db. Expected: 0.28936094 got: 0.42857142180110824
Expected Output:
dj_db at test w and b (non-zeros): -0.5999999999991071
dj_dw at test w and b (non-zeros): [-44.8313536178737957, -44.37384124953978]
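For reference, below is a minimal sketch of the vectorized logistic-regression gradient that the unit test appears to check against. The sigmoid helper and variable names are illustrative assumptions, not code taken from the graded notebook.

import numpy as np

def sigmoid(z):
    # logistic function, maps z onto (0, 1)
    return 1 / (1 + np.exp(-z))

def compute_gradient(X, y, w, b):
    # Returns (dj_db, dj_dw): partial derivatives of the logistic cost
    # with respect to b (scalar) and w (shape (n,)), averaged over the m examples
    m = X.shape[0]
    f_wb = sigmoid(X @ w + b)    # predictions for all m examples
    err = f_wb - y               # per-example prediction error
    dj_dw = (X.T @ err) / m      # (1/m) * sum_i (f_wb_i - y_i) * x_i
    dj_db = np.sum(err) / m      # (1/m) * sum_i (f_wb_i - y_i)
    return dj_db, dj_dw

Called exactly as in the cell above, dj_db, dj_dw = compute_gradient(X_train, y_train, test_w, test_b), this returns a scalar and an array with the same shape as test_w, which is what the shape assertion on line 54 of public_tests.py checks.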