DLS Course 2, Week 3, Exercise 6: cross-entropy cost fails with an assertion about the mean

Hello, I am getting the following error about taking the mean, even though I have done so. I have read all the related threads and debugged my code, but I am still missing something. Can someone spot what I am doing wrong? I have also transposed the logits and labels, and their shapes match when I check.


```
AssertionError                            Traceback (most recent call last)
in
     17 print("\033[92mAll test passed")
     18
---> 19 compute_cost_test(compute_cost, new_y_train)

in compute_cost_test(target, Y)
     13     print(result)
     14     assert(type(result) == EagerTensor), "Use the TensorFlow API"
---> 15     assert (np.abs(result - (0.25361037 + 0.5566767) / 2.0) < 1e-7), "Test does not match. Did you get the mean of your cost functions?"
     16
     17     print("\033[92mAll test passed")

AssertionError: Test does not match. Did you get the mean of your cost functions?
```

What is the incorrect value you get for the cost? Are you sure you used the from_logits argument to tell the cost function that it needs to do the softmax internally?

I figured it out. It was the from_logits parameter.

Thank you!
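For future readers hitting the same assertion: here is a minimal sketch of what the fix looks like. The function name `compute_cost` matches the exercise, but the shapes, variable names, and example numbers below are assumptions for illustration, not the course's actual data. The key point is passing `from_logits=True` so TensorFlow applies the softmax internally, then taking the mean with `tf.reduce_mean`.

```python
import tensorflow as tf

# Hypothetical sketch of compute_cost, assuming the notebook's convention
# that logits and labels arrive with shape (num_classes, num_examples).
def compute_cost(logits, labels):
    # Transpose so each row is one example, as the loss expects.
    # from_logits=True makes TensorFlow apply softmax internally;
    # omitting it is what triggers the assertion above.
    per_example = tf.keras.losses.categorical_crossentropy(
        tf.transpose(labels), tf.transpose(logits), from_logits=True
    )
    # The test compares against the *mean* of the per-example losses.
    return tf.reduce_mean(per_example)

# Tiny illustration with made-up numbers (not the course's data):
logits = tf.constant([[2.0, -1.0],
                      [0.5,  3.0]])    # (classes, examples)
labels = tf.constant([[1.0,  0.0],
                      [0.0,  1.0]])
cost = compute_cost(logits, labels)    # scalar EagerTensor
```

With `from_logits` left at its default of `False`, the loss treats the raw logits as if they were already probabilities, so no softmax is applied and the cost comes out wrong even though the mean is taken correctly.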