# Compute and display cost with non-zero w and b

test_w = np.array([0.2, 0.2])
test_b = -24.
cost = compute_cost(X_train, y_train, test_w, test_b)

print('Cost at test w and b (non-zeros): {:.3f}'.format(cost))

# UNIT TESTS

compute_cost_test(compute_cost)

Output:

Cost at test w and b (non-zeros): 3.200

AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>
      8
      9 # UNIT TESTS
---> 10 compute_cost_test(compute_cost)

~/work/public_tests.py in compute_cost_test(target)
     24     b = 0
     25     result = target(X, y, w, b)
---> 26     assert np.isclose(result, 2.15510667), f"Wrong output. Expected: {2.15510667} got: {result}"
     27
     28     X = np.random.randn(4, 3)

AssertionError: Wrong output. Expected: 2.15510667 got: 2.772340119495528

Expected Output:
Cost at test w and b (non-zeros): 0.218

Why are the expected outputs different? The unit test expects 2.15510667, while the notebook's Expected Output says 0.218. Which one is my code supposed to match?
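One way to check whether the issue is in the cost formula itself is to compare against a hand-rolled version of the standard binary cross-entropy cost. This is a minimal sketch, not the assignment's actual `compute_cost` (the assignment may use a loop rather than vectorized NumPy); it does reproduce the 0.693 value reported for zero weights, since with w = 0 and b = 0 every prediction is 0.5 and the cost is -log(0.5) = log(2) for any data:

```python
import numpy as np

def sigmoid(z):
    # Logistic function 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def compute_cost_reference(X, y, w, b):
    # Mean binary cross-entropy over m examples:
    # J = -(1/m) * sum_i [ y_i*log(f_i) + (1 - y_i)*log(1 - f_i) ]
    m = X.shape[0]
    f = sigmoid(X @ w + b)
    return -(1.0 / m) * np.sum(y * np.log(f) + (1 - y) * np.log(1 - f))

# Sanity check: zero weights and bias give f = 0.5 everywhere,
# so the cost is log(2) ~= 0.693 regardless of the data.
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([0.0, 1.0])
print(compute_cost_reference(X, y, np.zeros(2), 0.0))  # ~= 0.693
```

If this reference disagrees with your `compute_cost` on the same inputs, the bug is in the cost formula; if they agree, the problem is more likely in a helper such as `sigmoid`.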

I've looked at all the hints and made several attempts to modify my code accordingly, but I have not gotten the correct answer for this second unit test.

How do I debug this? My code looks correct, but there are several functions involved, so I'm not sure which part is failing.
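When several functions are chained, a practical approach is to test each helper in isolation against inputs whose outputs are known in closed form. A sketch (the `sigmoid` here is a hypothetical stand-in for your own implementation, swap yours in):

```python
import numpy as np

def sigmoid(z):
    # Stand-in for the helper under test.
    return 1.0 / (1.0 + np.exp(-z))

# Check against values with known closed-form outputs.
assert np.isclose(sigmoid(0.0), 0.5)        # sigmoid(0) is exactly 0.5
assert np.isclose(sigmoid(100.0), 1.0)      # saturates toward 1
assert np.isclose(sigmoid(-100.0), 0.0)     # saturates toward 0

# Vector input should broadcast elementwise, not collapse to a scalar.
out = sigmoid(np.array([0.0, 0.0, 0.0]))
assert out.shape == (3,)
assert np.allclose(out, 0.5)

print("sigmoid checks passed")
```

Once each helper passes its own checks, any remaining failure must be in how the top-level function combines them.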

Note: I do get the expected output in the first unit test
Cost at initial w and b (zeros): 0.693

I solved it! The fix was going back to how I had it originally; I had changed it after looking at the hints.