ConvNet Face Recognition `triplet_loss` wrong assertion!?!

    # Step 4: Take the maximum of basic_loss and 0.0. Sum over the training examples.
    loss = tf.reduce_sum(tf.maximum(basic_loss, 0.0))

The test code:

    y_pred_perfect = ([[1., 1.]], [[1., 1.]], [[1., 1.,]])
    loss = triplet_loss(y_true, y_pred_perfect, 5)
    assert loss == 5, "Wrong value. Did you add the alpha to basic_loss?"

The resulting loss is [[5, 5]], which, when reduced over all axes, gives 10. How can the assertion expect 5?

Recall that the square of the L2 norm is the sum of the squared differences: ||x − y||₂² = ∑ᵢ₌₁ᴺ (xᵢ − yᵢ)²
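The recalled formula points at the resolution: the squared distances should be reduced over the embedding axis first, so each training example contributes one scalar. Here is a hedged NumPy sketch (the function name `triplet_loss_np` and the `axis=-1` choice are my assumptions, mirroring what the graded TensorFlow function is expected to do):

```python
import numpy as np

def triplet_loss_np(anchor, positive, negative, alpha=0.2):
    # Steps 1-2: squared L2 distances, reduced over the embedding axis,
    # giving ONE scalar per training example.
    pos_dist = np.sum(np.square(anchor - positive), axis=-1)
    neg_dist = np.sum(np.square(anchor - negative), axis=-1)
    # Step 3: add the margin alpha.
    basic_loss = pos_dist - neg_dist + alpha
    # Step 4: max with 0, then sum over training examples only.
    return np.sum(np.maximum(basic_loss, 0.0))

a = p = n = np.array([[1., 1.]])
print(triplet_loss_np(a, p, n, alpha=5))  # -> 5.0
```

With the axis reduction, `basic_loss` for the perfect prediction is `[5.]` (one entry per example), so the final sum is 5. Without it, `basic_loss` stays `[[5, 5]]` (one entry per embedding component) and the sum over all axes is 10, which is why the assertion fails.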
