# W3 C2 TensorFlow

I need help completing the exercise… I have an error.

We can’t see all of your code, but for starters you have used a different loss function. They specifically told you to use categorical_crossentropy in the instructions, but you have used binary_crossentropy. There are 6 classes here, right?

There are several other things in the instructions that you need to pay attention to. Note the point about the shapes of the expected inputs. Also note that the output of the forward propagation is just the linear activation value (“logits”) and not the activation output. So you will also need to use the from_logits parameter to the loss function to tell it to include the softmax calculation as part of the loss.
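To illustrate that last point, here is a small sketch (with made-up toy tensors, not the assignment's data) of why `from_logits=True` matters: passing raw logits with that flag set is equivalent to applying the softmax yourself and passing probabilities.

```python
import tensorflow as tf

# Toy example: 2 examples, 6 classes, shape (batch, classes).
logits = tf.constant([[2.0, 1.0, 0.1, 0.0, -1.0, 0.5],
                      [0.3, 2.5, 0.2, 1.0,  0.0, 0.1]])
labels = tf.constant([[1.0, 0.0, 0.0, 0.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0, 0.0, 0.0, 0.0]])

# Passing raw logits with from_logits=True lets the loss apply the
# softmax internally (which is also more numerically stable) ...
loss_from_logits = tf.keras.losses.categorical_crossentropy(
    labels, logits, from_logits=True)

# ... which matches applying softmax yourself and passing probabilities.
probs = tf.nn.softmax(logits)
loss_from_probs = tf.keras.losses.categorical_crossentropy(
    labels, probs, from_logits=False)

print(loss_from_logits.numpy())
print(loss_from_probs.numpy())
```

Both calls return one loss value per example, and the two results agree up to floating-point precision.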


You can see the rest of what I have in this photo, with the problem:

tf.reduce_sum(tf.keras.losses.binary_crossentropy(labels, logits, from_logits=True))

You have the “from_logits” part correct, but you missed the other two points that I made in my previous reply:

1. The shapes of your inputs are not correct. You need to transpose the labels and logits.
2. You are using binary cross entropy. This is a multiclass problem, so you need categorical cross entropy as the loss function. They explicitly mention that in the instructions.
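Putting those two fixes together, a rough sketch of the loss computation might look like the following. The toy `labels` and `logits` here are stand-ins with shape `(6, m)` (classes first, as the assignment's forward propagation produces); they are not the assignment's actual tensors.

```python
import tensorflow as tf

# Stand-in tensors with the assignment's orientation: shape (6, m) = (classes, examples).
labels = tf.constant([[1., 0.], [0., 1.], [0., 0.],
                      [0., 0.], [0., 0.], [0., 0.]])
logits = tf.random.normal((6, 2))

# Transpose both to (m, 6) so each ROW is one example, then apply the
# categorical (multiclass) loss directly to the raw logits.
total_loss = tf.reduce_sum(
    tf.keras.losses.categorical_crossentropy(
        tf.transpose(labels), tf.transpose(logits), from_logits=True))

print(total_loss.numpy())
```

The inner call returns one loss per example; `tf.reduce_sum` then collapses them to a single scalar.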

How can I transpose a matrix in TensorFlow?

I changed it to…
total_loss = tf.reduce_sum(tf.keras.losses.categorical_crossentropy(labels, logits, from_logits=True))

but I still get an error.

Did you try googling “tensorflow transpose”? You will find this documentation page.
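For reference, `tf.transpose` is a one-liner; a minimal example of swapping the two axes of a matrix:

```python
import tensorflow as tf

x = tf.constant([[1, 2, 3],
                 [4, 5, 6]])   # shape (2, 3)

xt = tf.transpose(x)           # shape (3, 2): rows become columns
print(xt.numpy())
```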


Hello paulinpaloalto,

Thank you for the reply. How do I know to transpose the labels and logits? Should the output dimension of Y be (6, m=2)? Thank you!

Here’s a thread which explains why the transpose is required here.
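The short version, shown here with toy tensors of shape `(6, m=2)` (an assumed miniature of the assignment's data): the Keras losses treat the *last* axis as the class axis, so without the transpose you get the wrong number of loss values.

```python
import tensorflow as tf

labels = tf.constant([[1., 0.], [0., 1.], [0., 0.],
                      [0., 0.], [0., 0.], [0., 0.]])  # shape (6, m=2)
logits = tf.zeros((6, 2))

# Without the transpose, the last axis (size m=2) is treated as the class
# axis, so you get 6 loss values -- one per "class" row, which is wrong.
wrong = tf.keras.losses.categorical_crossentropy(
    labels, logits, from_logits=True)

# With the transpose, the class axis (size 6) is last, so you get one
# loss value per example, as intended.
right = tf.keras.losses.categorical_crossentropy(
    tf.transpose(labels), tf.transpose(logits), from_logits=True)

print(wrong.shape, right.shape)
```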

Great! Thank you so much!