Cost error. I do not understand where I made a mistake.

ValueError: Shapes (2, 4) and (2, 6) are incompatible.
Cannot compute this one. Any hints? What I have tried:
1: I transposed logits and labels.
2: I set from_logits=True.

Okay, I tried indexing with logits[0] and labels[0], and I got:
tf.Tensor(0.06969354, shape=(), dtype=float32)

AssertionError: Test does not match. Did you get the mean of your cost functions?

Expected output
tf.Tensor(0.4051435, shape=(), dtype=float32)

I used from_logits=False and I got:
tf.Tensor(0.39053392, shape=(), dtype=float32)

The dimension mismatch means that you “hard-coded” the number of classes in your “one hot” routine.

The other error is that you are passing the results as logits to the cost function, right? So you need to set that flag to True. Here’s a thread with a bit more info.
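Putting those points together, here is a minimal sketch of what the cost function might look like. The function name `compute_cost` and the assumption that the tensors arrive as (classes, batch) are taken from the context of this thread, not from your actual code:

```python
import tensorflow as tf

def compute_cost(logits, labels):
    """Sketch: mean categorical cross-entropy over the whole batch."""
    cost = tf.reduce_mean(
        tf.keras.losses.categorical_crossentropy(
            tf.transpose(labels),  # y_true -> (batch, classes)
            tf.transpose(logits),  # y_pred -> (batch, classes)
            from_logits=True,      # raw linear outputs, softmax not yet applied
        )
    )
    return cost
```

Note the three ingredients: no indexing (the whole tensors go in), a transpose so Keras sees (batch, classes), and `from_logits=True` because no softmax has been applied yet. `tf.reduce_mean` is what produces the mean the test asks about.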


It did not help, to be honest.

You’ve mentioned several different mistakes across your posts. For example, it’s a mistake to index the labels or logits: you should use the whole tensors. You also need to transpose them, and you also need to set the from_logits flag to True.

So what is the current incorrect value that you are getting?

Without indexing I’ve got:
ValueError: Shapes (2, 4) and (2, 6) are incompatible (with transposing).

ValueError: Shapes (4, 2) and (6, 2) are incompatible (without transposing).

I gave you the answer about that on my very first reply on this thread. Your “one hot” routine is incorrect: you are hard-coding the number of classes to 4. That passes the test for that function, but fails on any other test case that doesn’t involve 4 classes.

The inputs for the compute_cost test case are the output of applying the one hot routine to one of the input datasets.
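For reference, a sketch of the one-hot routine with the class count passed in as a parameter rather than hard-coded. The name `one_hot_matrix` and the `tf.reshape` call are assumptions based on this thread:

```python
import tensorflow as tf

def one_hot_matrix(label, depth=6):
    """Sketch: one-hot encode a single scalar label into a (depth,) vector."""
    # 'depth' is a parameter here -- never hard-code the class count to 4
    one_hot = tf.reshape(tf.one_hot(label, depth, axis=0), shape=(depth,))
    return one_hot
```

With the reshape target written as `(depth,)`, the same routine works for 4 classes, 6 classes, or any other count the test cases use.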

I’ve changed it to (depth,) and it works.
Thanks for your time and have a nice day.

But it is still a bit strange that the mistake was not shown.

Yes, it would be better if they had two different test cases with different numbers of classes to catch that type of bug. But it is always a mistake to hard-code things when you aren’t forced to do that.