Please pay attention to the from_logits flag in the loss function:

loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),

When from_logits is True, no sigmoid (or softmax) activation is needed on the output layer, because the loss applies the sigmoid to the raw logits internally. See this as well.
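To make this concrete, here is a minimal sketch (layer sizes and input shape are made up for illustration): the last Dense layer has no activation, so it outputs raw logits, and the loss handles the sigmoid itself.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1)  # no activation: outputs raw logits
])

model.compile(
    optimizer="adam",
    # from_logits=True: the sigmoid is applied inside the loss,
    # so the model's output stays as raw logits
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

Conversely, if the last layer used activation="sigmoid", you would keep the default from_logits=False, otherwise the sigmoid would effectively be applied twice.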