# Week 3 Assignment: Binary cross entropy

Hello,
While computing the cost in TensorFlow we use `tf.reduce_mean(tf.keras.losses.binary_crossentropy(y_true = …, y_pred = …, from_logits=True))`, and it is mentioned that this is equivalent to `reduce_mean(max(logits, 0) - logits * labels + log(1 + exp(-abs(logits))), axis=-1)`.
But in the notes we learned that binary cross entropy is `-sum(y_actual * log(y_pred) + (1 - y_actual) * log(1 - y_pred))`.
So why is a different cross entropy used here, and what is the significance of this new cost function?

Hi, @Vinayak.

In the exercise `y_pred` is a tensor of logits, so the loss function explicitly computes `sigmoid(y_pred)` internally, but other than that it is the same formula.
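As a quick sanity check, here is a minimal sketch in plain Python (not TensorFlow's actual implementation; function names are my own) showing that applying the sigmoid first and then the textbook formula gives the same value as the logits-based formula:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bce_from_probability(p, z):
    # Textbook binary cross entropy on a probability p in (0, 1).
    return -(z * math.log(p) + (1 - z) * math.log(1 - p))

def bce_from_logits(x, z):
    # Numerically stable formulation used when from_logits=True.
    return max(x, 0) - x * z + math.log(1 + math.exp(-abs(x)))

x, z = 1.7, 1.0  # an arbitrary logit and label
print(abs(bce_from_probability(sigmoid(x), z) - bce_from_logits(x, z)) < 1e-12)  # True
```

The two paths agree to floating-point precision for logits of moderate size; the derivation below shows why they are algebraically identical.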

Here is the derivation (from the source code):

``````  For brevity, let `x = logits`, `z = labels`.  The logistic loss is
z * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x))
= z * -log(1 / (1 + exp(-x))) + (1 - z) * -log(exp(-x) / (1 + exp(-x)))
= z * log(1 + exp(-x)) + (1 - z) * (-log(exp(-x)) + log(1 + exp(-x)))
= z * log(1 + exp(-x)) + (1 - z) * (x + log(1 + exp(-x)))
= (1 - z) * x + log(1 + exp(-x))
= x - x * z + log(1 + exp(-x))
For x < 0, to avoid overflow in exp(-x), we reformulate the above
x - x * z + log(1 + exp(-x))
= log(exp(x)) - x * z + log(1 + exp(-x))
= - x * z + log(1 + exp(x))
Hence, to ensure stability and avoid overflow, the implementation uses this
equivalent formulation
max(x, 0) - x * z + log(1 + exp(-abs(x)))
``````
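To see why the reformulation matters in practice, here is a small sketch in plain Python (not TensorFlow's code; `naive_loss` and `stable_loss` are hypothetical names) comparing the two forms on a large negative logit, where `exp(-x)` overflows a double:

```python
import math

def naive_loss(x, z):
    # x - x*z + log(1 + exp(-x)): exp(-x) overflows for large negative x.
    return x - x * z + math.log(1 + math.exp(-x))

def stable_loss(x, z):
    # max(x, 0) - x*z + log(1 + exp(-abs(x))): the exp argument is always <= 0.
    return max(x, 0) - x * z + math.log(1 + math.exp(-abs(x)))

x, z = -1000.0, 1.0
try:
    naive_loss(x, z)
    print("naive form succeeded")
except OverflowError:
    print("naive form overflowed")
print(stable_loss(x, z))  # finite: the log term underflows harmlessly to 0.0
```

Because `-abs(x)` is never positive, the exponential in the stable form can only underflow toward zero, which is harmless, whereas the naive form blows up whenever the logit is strongly negative.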

Good luck with the assignment!


My code is giving a syntax error in Week 3, Assignment (last graded assignment).

Code snippet:

Dense(40, activation = ‘relu’)

returns the error message: Syntax error, Dense(40…