Sir after running the cost function I am getting this error. I have taken the mean of cost also using reduce_mean.

AssertionError: Test does not match. Did you get the mean of your cost functions?

The shape of logits and labels should be of the form (num examples, num classes).

The inputs to the function are (num classes, num examples). Are you taking care of that?

Also, the parameter `from_logits` should be taken care of.

So should I transpose the logits and labels tensors?

Also, could you tell me what role the parameter `from_logits` plays? I could not figure it out from the official TensorFlow documentation.

Regards

Should you transpose? Yes.

Logits are used when the normalization has not been done yet, i.e. you're passing in just the raw outputs. Hopefully this link helps: python - from_logits=True and from_logits=False get different training result for tf.losses.CategoricalCrossentropy for UNet - Stack Overflow
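To make the "raw outcomes" point concrete, here is a small NumPy sketch (illustrative values, not assignment code) of what `from_logits=True` does internally: the loss normalizes the raw scores with softmax itself, so both paths below produce the same loss.

```python
import numpy as np

def softmax(z):
    # Numerically stabilized softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy_from_probs(y_true, probs):
    # Standard categorical cross-entropy on already-normalized probabilities.
    return -np.sum(y_true * np.log(probs), axis=-1)

def cross_entropy_from_logits(y_true, logits):
    # "from_logits=True" behavior: normalize first, then cross-entropy.
    return cross_entropy_from_probs(y_true, softmax(logits))

logits = np.array([[2.0, 1.0, 0.1]])   # raw, unnormalized scores
y_true = np.array([[1.0, 0.0, 0.0]])   # one-hot label

loss_raw = cross_entropy_from_logits(y_true, logits)
loss_normalized = cross_entropy_from_probs(y_true, softmax(logits))
# Both give the same value, about 0.417 here.
```

So "normalization" just means the softmax step; the flag tells the loss whether that step still needs to happen.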

Sir, by normalization do you mean performing the softmax operation on the last layer in this case? In forward propagation we have calculated only up to z3. The logits input to the cost function, is it z3? Should I calculate a3 by calling the softmax function and then pass that to the cost function?

Regards

Setting the `from_logits` flag to True is sufficient.
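In other words, you can pass z3 directly. A quick sketch (toy tensors, not the assignment's values) showing that raw scores with `from_logits=True` match softmax output with `from_logits=False`:

```python
import tensorflow as tf

z3 = tf.constant([[2.0, 1.0, 0.1]])   # raw last-layer output, no softmax applied
y  = tf.constant([[1.0, 0.0, 0.0]])   # one-hot label

# Path 1: hand the loss the raw logits and let it normalize internally.
loss_raw = tf.keras.losses.categorical_crossentropy(y, z3, from_logits=True)

# Path 2: apply softmax yourself and tell the loss the inputs are probabilities.
loss_soft = tf.keras.losses.categorical_crossentropy(
    y, tf.nn.softmax(z3), from_logits=False)
# The two losses agree (up to floating point), so the flag alone is enough.
```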

Sir,

I am taking these three steps:

[Removed code]

I am getting an assertion error:

```
AssertionError: Test does not match. Did you get the mean of your cost functions?
```

The loss function you want to use is `tf.keras.losses.categorical_crossentropy()`. It's provided in the exercise write-up in the markdown cell right above the code cell.
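Putting the three pieces from this thread together, a minimal sketch of the cost computation, assuming `logits` (z3) and `labels` arrive shaped (num classes, num examples) as discussed above; the names and toy values here are illustrative, not the official assignment solution:

```python
import tensorflow as tf

def compute_cost(logits, labels):
    # Transpose to (num examples, num classes), the layout the loss expects;
    # keep from_logits=True since softmax was not applied in forward prop;
    # then take the mean of the per-example losses.
    per_example = tf.keras.losses.categorical_crossentropy(
        tf.transpose(labels), tf.transpose(logits), from_logits=True)
    return tf.reduce_mean(per_example)

# Toy tensors shaped (3 classes, 2 examples):
logits = tf.constant([[2.0, 0.5], [1.0, 1.5], [0.1, 0.2]])
labels = tf.constant([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
cost = compute_cost(logits, labels)   # a scalar tensor
```

If the assertion error persists after the transpose, the `reduce_mean` over the per-example losses is the usual remaining culprit.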