DLS C2W3 compute_total_loss_test Grader Error (2024)

After coding the function “compute_total_loss” I keep getting the same error from the “compute_total_loss_test” grader. I just need someone to confirm that the grader is correct, because when I removed the “tf.transpose” of “minibatch” in the for loop, the grader was fine and I got “All tests passed”. Thanks in advance for your assistance.

Yes, the grader is correct and you have to use that transpose. Please read this for more details.

Hi, I have seen the link you shared. The issue is that when I use the transpose (as it should be) in the “compute_total_loss” function, I get the grader error. Then, in the next section, “3.3 - Train the model”, the model function already applies the transpose in the argument itself, in this line: “minibatch_total_loss = compute_total_loss(Z3, tf.transpose(minibatch_Y))”. With that in place, I get another error after running “parameters, costs, train_acc, test_acc = model(new_train, new_y_train, new_test, new_y_test, num_epochs=100)”. But if I remove the transpose from that line, so it reads “minibatch_total_loss = compute_total_loss(Z3, minibatch_Y)”, then everything runs and all results / outputs are as expected.
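(A quick, hedged way to see where the orientations disagree is to print the tensor shapes right before the loss call; the shapes and names below are illustrative assumptions, not the assignment's actual values.)

```python
import tensorflow as tf

# Illustrative assumption: 6 classes, minibatches of 32 examples.
minibatch_Y = tf.zeros((32, 6))  # one-hot labels as the dataset yields them (rows = examples)
Z3 = tf.zeros((6, 32))           # forward-prop output with examples in columns

# Print the shapes before deciding where the transpose belongs:
print("Z3:", Z3.shape)                                                 # (6, 32)
print("minibatch_Y:", minibatch_Y.shape)                               # (32, 6)
print("tf.transpose(minibatch_Y):", tf.transpose(minibatch_Y).shape)   # (6, 32)
```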

What error are you getting? Please share it with us. And never move on to the next section until you have passed the previous one. Also, make sure you are using TensorFlow's transpose (very important) for both the labels and the logits, and set from_logits=True.
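For future readers, here is a minimal sketch of the convention described above, not the official assignment solution: the loss helper transposes both labels and logits (assumed here to arrive with shape (num_classes, batch_size)) before calling tf.keras.losses.categorical_crossentropy, and passes from_logits=True because Z3 is the raw linear output with no softmax applied.

```python
import tensorflow as tf

def compute_total_loss(logits, labels):
    """Sum of per-example categorical cross-entropy losses (sketch).

    Assumes logits and labels both arrive with shape (num_classes, batch_size),
    so each is transposed to (batch_size, num_classes) before the loss call.
    """
    per_example_loss = tf.keras.losses.categorical_crossentropy(
        tf.transpose(labels),   # y_true: one-hot labels, (batch_size, num_classes)
        tf.transpose(logits),   # y_pred: raw scores, same shape
        from_logits=True,       # no softmax has been applied to Z3
    )
    return tf.reduce_sum(per_example_loss)

# Illustrative check with made-up numbers: 3 classes, 2 examples.
logits = tf.constant([[2.0, -1.0],
                      [0.5,  0.3],
                      [1.0,  2.0]])
labels = tf.constant([[1.0, 0.0],
                      [0.0, 0.0],
                      [0.0, 1.0]])
print(compute_total_loss(logits, labels))  # scalar tensor
```

With this orientation, a call site like the quoted “compute_total_loss(Z3, tf.transpose(minibatch_Y))” only lines up if the dataset yields minibatch_Y with examples in rows, which is why the transpose has to appear consistently in both places.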

Hi, thanks indeed, Saif, for your valuable input. It helped me fix it, and I trust all is now in good order.