Compute the total loss

Hi,

Is there any mistake in the code that I am using? I am getting an assertion error.

I have used the provided function compute_total_loss_test; it cannot be modified.

def compute_total_loss_test(target, Y):
    pred = tf.constant([[ 2.4048107,   5.0334096 ],
                        [-0.7921977,  -4.1523376 ],
                        [ 0.9447198,  -0.46802214],
                        [ 1.158121,    3.9810789 ],
                        [ 4.768706,    2.3220146 ],
                        [ 6.1481323,   3.909829  ]])
    minibatches = Y.batch(2)
    for minibatch in minibatches:
        result = target(pred, tf.transpose(minibatch))
        break

    print("Test 1: ", result)
    assert(type(result) == EagerTensor), "Use the TensorFlow API"
    assert (np.abs(result - (0.50722074 + 1.1133534) / 2.0) < 1e-7), "Test 1 does not match. Did you get the reduce sum of your loss functions?"

    ### Test 2
    labels = tf.constant([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
    logits = tf.constant([[1., 0., 0.], [1., 0., 0.], [1., 0., 0.]])

    result = compute_total_loss(logits, labels)
    print("Test 2: ", result)
    assert np.allclose(result, 3.295837), "Test 2 does not match."

    print("\033[92mAll test passed")

compute_total_loss_test(compute_total_loss, new_y_train)
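As a side note on the shape handling in the test above (a standalone sketch with made-up numbers, not the assignment's hidden solution): tf.keras.losses.categorical_crossentropy expects tensors of shape (num_examples, num_classes), while the forward pass in this course produces (num_classes, num_examples), which is why a transpose is needed somewhere along the way.

```python
import tensorflow as tf

# Toy data in the (num_classes, num_examples) layout the network produces:
# 3 classes, 2 examples. These values are illustrative only.
logits = tf.constant([[2.0, 0.5],
                      [1.0, 1.5],
                      [0.1, 3.0]])   # shape (3, 2)
labels = tf.constant([[1.0, 0.0],
                      [0.0, 0.0],
                      [0.0, 1.0]])   # shape (3, 2), one-hot per column

# categorical_crossentropy wants (num_examples, num_classes),
# so both tensors are transposed before the call.
per_example = tf.keras.losses.categorical_crossentropy(
    tf.transpose(labels), tf.transpose(logits), from_logits=True)

print(per_example.shape)  # one loss value per example
```

Note that from_logits=True tells the loss to apply the softmax internally; passing raw linear-unit outputs without it silently gives wrong values.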

In the assignment, I have modified the compute_total_loss function as follows.

def compute_total_loss(logits, labels):
    """
    Computes the total loss

    Arguments:
    logits -- output of forward propagation (output of the last LINEAR unit), of shape (6, num_examples)
    labels -- "true" labels vector, same shape as Z3

    Returns:
    total_loss -- Tensor of the total loss value
    """

    #(1 line of code)
    # remember to set `from_logits=True`
    # total_loss = ...
    # YOUR CODE STARTS HERE

    Moderator Edit: Solution Code Removed.

    # YOUR CODE ENDS HERE
    return total_loss

Please share your insight on what might be the reason for the failure.

Yes. Here is the checklist.

The compute_total_loss_test code is correct; it is just testing your code.

  1. Make sure you did the transpose on the inputs. - Yes, the test does result = target(pred, tf.transpose(minibatch)).

  2. Make sure you use the from_logits option to tell the cost function that you are giving it logits and not activation values. - Yes, it is logits.

  3. Make sure you use reduce_sum and not reduce_mean to get the final scalar value. - I have used the reduce_sum function.

  4. Make sure you use the loss function specified in the instructions. - I used tf.keras.losses.categorical_crossentropy.

  5. Make sure you specify the positional arguments to the loss function in the correct order. - Yes, the positional arguments are correct.
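On point 3, here is a toy illustration (made-up loss values, not the assignment's numbers) of the difference between reduce_sum and reduce_mean over a vector of per-example losses:

```python
import tensorflow as tf

# Two hypothetical per-example loss values, for illustration only.
per_example_losses = tf.constant([0.5, 1.1])

total = tf.reduce_sum(per_example_losses)   # ~1.6, the "total loss" the test expects
mean = tf.reduce_mean(per_example_losses)   # ~0.8, an average, not what the test expects

print(float(total), float(mean))
```

Mixing the two up is a common cause of the "Did you get the reduce sum of your loss functions?" assertion message.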

No, you don't have to change the test code. The checklist refers to the part of your code between # YOUR CODE STARTS HERE and # YOUR CODE ENDS HERE. I see your code for compute_total_loss is not using transpose.


Please do not share your code on the forum. That’s not allowed by the Code of Conduct.

Posting your error messages is sufficient.