Problem in Assignment 5 of Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

Hello!

I have a problem with the computation of the cost in the graded function compute_cost in the last assignment of Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization.

I was sure I had made no mistake, but the cost I obtain is 0.88275003 while the expected cost is 0.810287, so my function does not pass the tests. Could you please verify that there are no mistakes in the test code? Also, the explanation of how we are supposed to implement the function is confusing. Could I discuss this with an instructor to clarify? Thank you! Ana


0.810287 is the correct value. Please fix your code by paying attention to the shapes of the function's arguments and the shapes that categorical_crossentropy expects.
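
For reference, here is a minimal sketch with toy data (not the assignment code) of the shapes categorical_crossentropy expects: examples along the first axis, classes along the last axis.

```python
import tensorflow as tf

# Toy data: 2 examples, 3 classes, shape (num_examples, num_classes).
labels = tf.constant([[1., 0., 0.],
                      [0., 1., 0.]])        # one-hot labels
probs  = tf.constant([[0.7, 0.2, 0.1],
                      [0.1, 0.8, 0.1]])     # predicted probabilities, same shape

per_example = tf.keras.losses.categorical_crossentropy(labels, probs)
print(per_example.shape)            # (2,) -- one loss value per example
cost = tf.reduce_mean(per_example)  # reduce to a scalar cost
```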


Hello!

I managed to solve it. I had to be careful with the parameters of the tf.keras.losses.categorical_crossentropy function and realise that I had to use from_logits=True. :) Otherwise, even with the correct shapes, the value would have been too big. But it is solved now, thank you!
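
In case it helps anyone else, here is a small sketch with made-up logits (not the assignment values) of what from_logits=True changes:

```python
import tensorflow as tf

labels = tf.constant([[1., 0., 0.],
                      [0., 1., 0.]])
logits = tf.constant([[2.0, 0.5, -1.0],      # raw, unnormalised network outputs
                      [0.2, 1.5,  0.3]])

# from_logits=True tells the loss to apply the softmax itself ...
loss_from_logits = tf.keras.losses.categorical_crossentropy(
    labels, logits, from_logits=True)

# ... which matches applying the softmax explicitly and passing probabilities.
loss_softmaxed = tf.keras.losses.categorical_crossentropy(
    labels, tf.nn.softmax(logits))

# Passing raw logits with the default from_logits=False treats them as
# probabilities and gives an incorrect value.
loss_wrong = tf.keras.losses.categorical_crossentropy(labels, logits)

print(loss_from_logits.numpy(), loss_softmaxed.numpy(), loss_wrong.numpy())
```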

Best,

Ana

The notebook's expected value of 0.810287 is correct. Fix your invocation of tf.keras.losses.categorical_crossentropy.

Posting code in a public topic is discouraged and can get your account suspended. It’s okay to share a stack trace in a public post and to send code to a mentor via direct message. Please clean up the post. Here’s the community user guide to get started.


Sorry, I didn’t know. I apologise…
Finally, I got the answer: I need to take the transpose of the logits so that their dimensions match those of the labels.
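
Concretely, a toy sketch of that idea (assuming the course convention of (num_classes, num_examples) tensors; the shapes below are made up, and the exact reduction in the assignment may differ):

```python
import tensorflow as tf

num_classes, num_examples = 6, 32  # hypothetical sizes

# Course-style layout: classes along the first axis, examples along the second.
logits = tf.random.normal((num_classes, num_examples))
labels = tf.transpose(tf.one_hot(
    tf.random.uniform((num_examples,), maxval=num_classes, dtype=tf.int32),
    depth=num_classes))             # also (num_classes, num_examples)

# Transpose both tensors to (num_examples, num_classes) before calling the loss.
per_example = tf.keras.losses.categorical_crossentropy(
    tf.transpose(labels), tf.transpose(logits), from_logits=True)
cost = tf.reduce_mean(per_example)  # scalar cost, averaged here over the batch
print(cost.numpy())
```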


No worries. Thanks for removing your code.